Radiolab


Breaking News

Thursday, July 27, 2017 - 10:09 PM

Simon Adler takes us down a technological rabbit hole of strangely contorted faces and words made out of thin air. And a wonderland full of computer scientists, journalists, and digital detectives forces us to rethink even the things we see with our very own eyes. 

Oh, and by the way, we decided to put the dark secrets we learned into action and unleash this on the internet.

Reported by Simon Adler. Produced by Simon Adler and Annie McEwen.

Special thanks to everyone on the University of Southern California team who helped out with the facial manipulation: Kyle Olszewski, Koki Nagano, Ronald Yu, Yi Zhou, Jaewoo Seo, Shunsuke Saito, and Hao Li. Check out more of their work at pinscreen.com.

Special thanks also to Matthew Aylett, Supasorn Suwajanakorn, Rachel Axler, Angus Kneale, David Carroll, Amy Pearl and Nick Bilton. You can check out Nick’s latest book, American Kingpin, here.

Support Radiolab by becoming a member today at Radiolab.org/donate.

Guests:

Matthew Aylett, Nick Bilton, Hany Farid, Durin Gleaves, Ira Kemelmacher-Shlizerman, Jon Klein and Steve Seitz


Comments [94]

GS from Yokohama

I saw the Adobe demo last year and my immediate thoughts were of the dangers to democracy. Then I listened to this episode and was dumbfounded by Ira Kemelmacher-Shlizerman. That she, as an academic, cannot foresee the dangers associated with this technology leads me to believe she has been living under a rock since Trump won the Republican nomination.

Oct. 19 2017 12:39 AM
bubblebuster

Scarier than how easy it will soon be to fake videos is how much easier plausible deniability will become. Claiming "fake news" about anything inconvenient will be easy, and while you might be able to prove what is real in court, it's too late at that point (if you ever get there).

Oct. 08 2017 03:45 PM
Walter

Buahahahaaa! I feel like my IQ just went up 50 to 100 points after watching the fake video and reading the dramatic comments! I at least believe the developers have a loooong way to go to really convince me, lol! Thanks for trying, guys; I'm sure it counts for something.

Oct. 08 2017 02:27 PM
anthony from los angeles

The only thing scarier than the content of the story was the take the female researcher (Ira Kemelmacher-Shlizerman) had on the technology.
At best she overestimates the intelligence of most people.
I don't know if it is just an overwhelming belief in humanity or a colossal failure of imagination.

this tech scares me.

Oct. 06 2017 06:10 PM
Nas

This is disturbing, and very creepy. It was disappointing how little thought the female computer scientist (I didn't catch her name) had put into the moral implications of her technology. Tech is progressing so fast, and it's not really policed like other sciences, and yet it affects the day-to-day life of almost the entire population.

I may be biased as I am a biologist, but I think these advances in technology need to be justified to a particular body, just as experiments need to be justified to a grant body on how they actually improve life. The obvious difference is in the money between these sciences, so I unfortunately can't see that happening in the future.

Oct. 01 2017 06:24 PM
Johnny from Ventura, CA

Love this show in general, but wow, this episode is just chock full of blatant sensationalism. Let's break it down. What exactly terrifies you? That everything won't be as it seems? It is frankly more frightening to me that the producers of this show somehow think we are only NOW going to get into truth wars, that we are only now getting into the very tedious business of he said, she said. That we have somehow been blissfully receiving only the truth in our broadcasts across all media. To think that only now are we really in danger of not knowing what "the truth" is reflects a level of ignorance that I cannot sit by and not respond to. It's ridiculous. We have been lied to the good old-fashioned analog way since time immemorial. Watergate, "I did not have sexual relations with that woman," the whole insane mechanics of the politics of war, which are built on deceit and illusions. The fact that banks loan out money that doesn't even exist. You guys think that all of a sudden now we are in a crisis of conscience? I'd wager that "the truth," as we generally define it, factors into about 10% of what we actually get from our news sources. When will we finally see the futility in placing so much trust, expending so much energy, on anything? It's all a facade. Bait and switch, smoke and mirrors. Get it through your heads: the world is NOTHING like you've determined it. This is not a reason to panic. This is actually potentially cause for celebration. Perhaps the inevitable impossibility of determining what the so-called "truth" is will lead to our liberation from having to define it! Perhaps we won't be beholden to some unwritten law of blind subservience. Or is that what you're ultimately so afraid of? That things won't be tangible or concrete anymore. That you'll be left realizing you've been defining and creating your world ineffectually, ignorantly deciding what to hold as truth and what to toss aside as lies. The terror of THAT is real.
I'd have appreciated the show much more if the producers were willing to get down to that level for once instead of stirring the damn pot that needs no more stirring; it's stirring itself. There's enough erroneous fear running rampant in our culture without you guys creating more of it. Stop the madness. Do a show about how nothing has ever been what it appeared to be - about consciousness expansion and the potential for humanity we will have when we abolish our old ignorant views of the world. You're spreading unnecessary unease with shows like this. You're all smart folks; can't you see this perspective as valid? Don't delude yourselves any longer. You've been lied to your whole life. NOW is the time to start determining, for yourself, not based on anything anyone told you, what is "real." And maybe you'll determine that "reality" isn't all it's cracked up to be. Then you can move beyond your definitions and limited worldview and explore the infinitude of everything, unbound by societal norms and the status quo.

Oct. 01 2017 03:47 PM
Rita

Why the apostrophe after "it"?

Sep. 29 2017 12:27 PM
Emily Dickinson from Seattle

Smile, you're on Candid Camera!

Sep. 17 2017 01:42 PM
Dustin Currie from Salt Lake City

So this was initially scary. But it's not hard to imagine a simple fix using encryption keys. If this got bad, it wouldn't be hard to use digital certificates issued by a certificate authority to verify the source of some video/audio/text and that the media was still in its original state. This could work similarly to the way your browser verifies that 'chase.com' maps to the real JP Morgan Chase website.

Crazy though.
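
The certificate idea above can be sketched in miniature. Python's standard library has no public-key crypto, so this purely illustrative stand-in uses an HMAC tag over the media bytes instead of a CA-issued signature (the key and sample bytes below are invented); the tamper-evidence principle is the same: change one byte of the media and verification fails.

```python
import hmac
import hashlib

# Stand-in for a signing key a certificate authority would vouch for.
# Real systems would use public-key signatures (e.g. Ed25519 or RSA),
# so verifiers never hold the signing secret.
SIGNING_KEY = b"publisher-secret-key"

def sign_media(media_bytes: bytes) -> str:
    """Produce a tamper-evident tag for a video/audio/text blob."""
    return hmac.new(SIGNING_KEY, media_bytes, hashlib.sha256).hexdigest()

def verify_media(media_bytes: bytes, tag: str) -> bool:
    """Check that the blob is unchanged since it was signed."""
    return hmac.compare_digest(sign_media(media_bytes), tag)

original = b"raw video frames..."
tag = sign_media(original)

print(verify_media(original, tag))                     # True: untouched
print(verify_media(b"doctored video frames...", tag))  # False: altered
```

Browser-style trust would add one more layer on top of this: a certificate chain proving the signing key really belongs to the claimed publisher.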

Sep. 11 2017 02:59 AM
J from Lansing

Still need to listen to this one, but wanted to comment on the video - I can definitely tell it is fake on a higher resolution screen, which is a good thing. However, as this tech develops, I have a feeling the quality will get much better and in turn it will be harder for the average person to distinguish a real from a fake.

Sep. 08 2017 11:17 AM
chopinois from 97211

Long time fan, first time commenter. I enjoyed listening to this episode, which at times was chilling in its implications for the reliability of the media.

I did take issue with the notion that the only fix for this problem is technological/scientific in nature. One of the guests said something along the lines of "English majors won't fix this problem" but later said that what needs to happen to help fix it is for people to be more discerning and more literate in their media habits. This is LITERALLY EXACTLY what English studies, and the humanities in general, teaches. So why was he dismissive of the humanities as being helpful, and why didn't you as the producers challenge him on this oft-heard yet deeply problematic position?

I would be very interested in a Radiolab episode that explores the (often overt) bias scientists have against the humanities while simultaneously presenting their greatest problems as ones that require skills taught by the humanities to fix.

The commonly raised issue of "X technology/concept should not be studied by science because it is too powerful/dangerous for humanity to have access to" is one that scientists hate, I know, and they usually respond either by saying that this issue is not really their concern (or that someone else will do it anyway) OR they rely on a frankly naive faith that humanity will act with wisdom and forbearance when they have access to newly powerful technologies. A cursory education in history will disabuse one of the idea that humans will generally act with wisdom and forbearance, especially in any society and time period where the humanities and the creative arts are marginalized and underfunded. Further, the critical thinking skills, empathy, and imagination necessary to live with wisdom and forbearance are taught in the humanities, the arts, and to some extent the social sciences (and arguably in progressive religious traditions). That scientists often deride these fields while implicitly or explicitly relying on the skills taught only in these fields to rein in the potential catastrophic consequences of their research is, shall we say, a problem, and something well worth investigating!

For a jumping off point, I suggest seeing Neil deGrasse Tyson's tweet and the subsequent comments here:

https://twitter.com/neiltyson/status/904861739329708034

Sep. 07 2017 03:46 PM
Joe Friday from USA

Really scary dystopian story. Let's hope this technology doesn't get out of hand, but it's likely that it already exists within the various spy and counterintelligence agencies. Maybe we will need people to serve as Fair Witnesses in the future, like in Robert Heinlein's Stranger in a Strange Land. http://dlkphotography.com/fair-witness/stranger-in-a-strange-land

Sep. 05 2017 05:28 PM
Jessica G

I was at Adobe MAX 2016. The speaker also said that VoCo audio has watermarking so that it's detectable as a fake. As concerned as I am about the ease of audio editing amid the growing epidemic of fake news, please remember that audio engineers and editors have been able to do this since the dawn of audio.
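
A toy version of such a watermark, purely for illustration (the marker pattern and sample values below are invented, and real audio watermarks are robust spread-spectrum designs that survive compression and re-recording, which this one does not): hide a known bit pattern in the least significant bits of the PCM samples, then check for it later.

```python
# Toy LSB watermark: hide a known bit pattern in audio samples.
MARK = [1, 0, 1, 1, 0, 0, 1, 0]  # hypothetical 8-bit marker

def embed(samples):
    """Overwrite the least significant bit of the leading samples with the marker."""
    out = list(samples)
    for i, bit in enumerate(MARK):
        out[i] = (out[i] & ~1) | bit
    return out

def detect(samples):
    """True if the marker pattern is present in the leading samples' LSBs."""
    return [s & 1 for s in samples[:len(MARK)]] == MARK

audio = [1000, 1001, 998, 1003, 997, 1002, 999, 1000, 1005]
marked = embed(audio)
print(detect(marked))  # True
print(detect(audio))   # False: unmarked audio lacks the pattern
```

Flipping the LSB changes each sample by at most one quantization step, which is inaudible at 16-bit depth; the trade-off is that any re-encoding destroys the mark, which is why production systems use much hardier schemes.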

Sep. 04 2017 02:19 PM
Collin from Provo, UT

I think what is most concerning is not that somebody will create fake video and audio. It's what it does to our ability to believe real, unmanipulated recordings. Yes, it's scary that somebody could make Barack Obama or Donald Trump say things they didn't say. But I think it's even scarier how this further erodes our common agreements around truth and evidence. It allows any real recording to be denied and called fake.

Sep. 01 2017 01:05 PM
Ed from South Carolina

I think the scientists behind this are reckless and completely naive to presume that because they know it's fake, others will too. The woman with the accent is oblivious to how this could be abused. She doesn't even accept that it can be.

It's too late to turn back now, but the ability to prove that the videos that are coming are fake is completely immaterial. People are willing to believe even the simplest photoshops when it is completely obvious the photos are faked, and even when they are shown to be fake by their creators. So even if you can analyze the videos to prove they are fake, it does not, and never will, matter.

This technology was coming no matter what, so that part, while terrifying, doesn't bother me as much as the fact that its creators are stupid enough to think it doesn't matter. She is as unwilling to accept that someone would believe the fakes are real as the people who believe fake photoshops are real. Two sides just as unrealistic in their thinking as each other.

Sep. 01 2017 10:10 AM
A real person from Toronto

Well, it's out of the bottle now and, as with much technological "progress," I doubt it could have been kept in through mere regulation. That being said, it is terrifying technology that needs to be handled as carefully as enriched uranium. The fact that the very researcher developing the tech was utterly stumped when asked whether she thinks it's dangerous is horrifying. She clearly had not carefully considered the social implications of this, instead invoking the totally frivolous justification that it could be applied to making her mother telepresent. Well, forgive me for being unsympathetic toward the possibility of a digital recreation of your mother speaking to you with fake facial expressions when that technology is easily capable of starting a major war. To be so unthoughtful, and to abdicate any moral responsibility by hiding behind the ethos of technological determinism, is cowardly and irresponsible, and gives a bad name to good scientists and engineers everywhere.

Aug. 30 2017 01:16 PM
jonathan de la rosa

I'm sorry, but the social and political implications of this are so horrifying that I can't get behind this.

We already live in a world where fact-checking is practically powerless to do anything significant. This will help create an epidemic of fake news, propaganda, and disinformation so thoroughly convincing to the already-propagandized that no amount of responsible journalism will be able to fix the resulting cultural and political mess. And for the people who haven't been taken in by political and cultural grifters, the ability to distinguish between true and false will get even more muddled than it already is, under conditions where big content producers are increasingly unanswerable to their audiences. And the powers that be will be able to withdraw behind all the incoherence, noise, and spectacle and push their agendas, just like they do now.

This needs to be at the very least restricted to creative endeavors, and ideally outright destroyed and banned.

Aug. 30 2017 03:20 AM
Jonas Minnberg from Stockholm

What is that wonderful music playing during the end credits?

Aug. 30 2017 02:46 AM
Javier Sanchez from Miami

In Steven E. de Souza's screenplay for The Running Man (1987) he envisioned a dystopian 2019 where our protagonist is falsely convicted for a massacre after the footage was manipulated and produced as news.

The problem is not so much that this is doable, but rather the general gullibility of our population, who are (systematically?) incapable of critical thinking and flock to random social media claims, even when they link to fairly obviously "fake" websites.

Aug. 27 2017 04:23 PM
Ap from NC

One minor quibble: Early in the episode, someone mentions Toy Story as an example of motion capture. Pixar famously does not use motion capture. Every motion in their movies is determined by the animators. Lord of the Rings or the recent Planet of the Apes films would be better examples.

Aug. 25 2017 09:04 PM
Tom Lum from NY

If we want to fake audio of Trump or Obama, all you need is someone who can pull off a decent impression... This episode felt like one enormous scapegoating of technology for the real issue of fake news... the fabrication of fake news isn't the issue at hand; it's the belief in it, the lack of people seeking out multiple sources and confirmation and thinking critically. There's no holy validity or platonic truth to audio clips or video clips, and there never has been, just as with print.

Aug. 21 2017 10:00 AM
John from Carolina

Dilip, you were correct about their truth right up until they pulled the episode because someone complained about the truth. Now they've edited the truth to fit their audience and become just like all the other fake news organizations. It's a shame they didn't stand up for the truth.

And Flip, it concerns me that both sides will be misrepresented.

Aug. 20 2017 09:28 PM
Jennifer

Do you think apps like FaceRig have the ability to capture facial expressions and voices for later use like these digital impressions? Could be that everyone using it is pretty much voluntarily offering up data for use.

Aug. 19 2017 07:24 AM
Angelica Dawn from San Diego

What will happen to voice actors?! Will that job cease to exist with this program? Will you instead have to sell the rights to your voice?

Aug. 18 2017 01:05 AM
Dilip Kondepudi from Winston Salem, North Carolina

Radiolab
I was riveted to my iPhone as I listened to your “Breaking News” episode. Thank you for making your listeners aware of technology that will have great impact on our world. I have the following thoughts I would like to share.

- We are moving into a world in which Truth will become a very valuable commodity. Just as material technology has resulted in the production of processed foods, or what Michael Pollan, author of “In Defense of Food,” calls “food-like edible substances” or fake food, the technology of the digital world has begun to produce recorded-reality-like videos, or fake recordings.
In the material world, the consequence of fake food is the growth of grocers like Whole Foods and a growing market for more expensive unprocessed and organic foods, because real food is what we need. It matters. People still enjoy fake food, but most know (and some don't care) that fake food is not healthy. Similarly, in the digital world, radio and TV outlets that can maintain their credibility for reporting facts and unprocessed reality (we understand some editing is essential) will grow in value, because truth matters. So programs like yours will become even more valuable and important than they are now, and listeners like me will be willing to make larger contributions than they do now - I certainly am, and I have made my contribution today. I am not worried that honest journalists will cease to exist, but I know that it will cost more to support them.

Jad, Robert and all who work at Radiolab, there is money to be made. What you produce will grow in value. Don’t lose this opportunity to become rich. Put on your entrepreneurial hats and figure out how!!!!

Aug. 16 2017 10:47 PM
Jeremy Leyland from Earth... but moving

In theory... could you use this technology, in conjunction with AI, to re-create a loved one who died? Combined with machine learning, video like in Star Wars, and VoCo... you could bring someone back. This could be like the Harry Potter portraits, kind of thing.

Aug. 16 2017 04:30 PM
Tiffany J. from Orlando, Fl

https://www.instagram.com/p/BXxx5vSjh1a/

I heard this podcast about fake news, then saw this a few days later. Lol, cute, but the whole idea that this could be placed in the wrong hands... not so comical!

Aug. 16 2017 07:56 AM
Leah from Irvine

This was some seriously belligerent investigating. After the Adobe commercial, you feature a few men and their technologies respectfully, but then aggressively grill the woman technologist and demand that this non-American answer for American fake-news issues writ large? You actually recorded her umm-ing like that after putting her on the spot, and actually aired it? No other technologist was given this disrespectful attitude or even remotely challenging questions throughout the show. It was painful to listen to.

Of course technologists should be mindful about what their designs engender, but radios were also used to fool the masses -- this is not a new idea or phenomenon. Further, this new technology has plenty of other implications outside American fears of "fake news" that cut across cultures / intersections that y'all don't even bother touching on. This ep was fear-mongering, lazy, self-indulgent, offensive, and v disappointing.

Aug. 15 2017 12:38 AM
Huxley Ford from Panama City FL

Couldn't help but notice... You skipped Barack Obama and went straight for George Bush. Nice... I still love the old episodes; send me an email if you ever come back from the crazy left...

Aug. 13 2017 06:35 PM
Benjamin Reaves from Silicon Valley, CA

I'm quite glad to see this technology available - just like Photoshop, it teaches us to be wary of believing what we hear too quickly. This technology seriously works - and not only at Adobe but at other companies and universities not mentioned on Radiolab, for example lyrebird.ai (no, I won't work for them).

Aug. 10 2017 10:59 PM
Dr Schaefer from Phoenix

I'm not concerned about political operatives making fake videos of Obama or Trump. I'm more concerned about the FBI using something like this so they no longer have to deal directly with Muslims; rather, they pick random brown dudes on Facebook and make videos of them "confessing" to hatching terrorism plots that the FBI makes up, then the FBI claims credit for stopping these "terrorist events" that were never going to happen in the first place. http://www.kansascity.com/news/local/crime/article135871988.html

Aug. 10 2017 04:44 PM
TjS from NYC

Sorry this is off topic but I am curious what the song at 47:13 min is?
Does anybody know?

Thanks!!

Aug. 08 2017 11:45 PM
A R J from WashDC

It could be the future of print media once the electronic media is completely fouled.

Aug. 07 2017 03:42 PM
Fubeman from Washington State

Well, if Jad and Robert (and I love them) had actually finished watching the VoCo demo, they would have learned that Adobe is also working on implementing a watermark system to make all audio manipulated by VoCo easily detectable before they plan on releasing this. So there's that . . .

P.S. Also, the video that they used for their "fake" demo was quite horrible and sooooo easily discernible as fake. The mouth movements looked like a bad puppet show, the blurring and pixelation around the mouth were horrendous, and the timing was off by a mile. Sorry. If the video aspect of this is going to scare me, it is going to have to up its game quite a bit.

Aug. 06 2017 07:52 PM
Linda from Seattle

I appreciate Radiolab's generous, inquisitive approach to their subject matter -- whatever that subject matter may be. Science fills me with wonder, so it's not too difficult to be awed by a science-focused episode. Others such as the hunting and conservation ep really elevated the discussion and explored the nuances and realities in ways I hadn't considered.

This episode felt a bit foreign to me. I wasn't filled with awe and wonder. I wasn't presented with a contextually rich exploration of a technology. And while I'm certainly accustomed to gotcha, soundbite journalism, Radiolab usually eschews such tired tactics. This wasn't the case with the Ira Kemelmacher-Shlizerman clip that perturbed other commenters. Perhaps she's never considered the ethical implications, but perhaps she simply didn't have a polished, Supreme Court-ready response. Not everyone is articulate when put on the spot or as gifted a communicator as, say, Robert Sapolsky.

I haven't listened to the next episode. The blurb positions it as somewhat redemptive. I look forward to hearing it.

Aug. 06 2017 07:18 PM
DeGroot

It was mentioned in other comments that the ability to convincingly alter still photos is a bit of a template for how this may roll out. I've thought for some time now that photo manipulation is one of the factors pushing people to be suspicious of everything, not just photos. People know that photos can be manipulated, so we're much less trusting of a photo that shows us something we're not quite prepared to believe.

We've now also had years of people pushing the idea that the media is manipulating us and telling us not to believe the other media source.

This developing video technology will only add to people's tendency toward confirmation bias. What is left that can reach across all these entrenched opinions and say, "I am truth?" More and more the lay person is going to go with their gut on what sounds plausible and reach to the media source that confirms this.

We are in deep doodoo as a society.

Aug. 06 2017 11:54 AM
Jad Abumrad from my anxiety.

Freak Out Now!! An evil scientist named Tim Berners-Lee has written a program that lets people publish ANY words that can then be seen anywhere in the world! With paper books you know something is true, but with this new system TRUTH WILL DIE!

Aug. 06 2017 11:53 AM
Claire from Washington, DC

I don't think this is cause for too much panic. Humans have long told and written lies... created illusions. We've always believed what we hear and read to the degree we trust our source. We would benefit most from developing more powerful, objective ways to determine the trustworthiness of the source. The trick is developing an algorithm that isn't (and isn't perceived as) politicized. We're already suffering for the lack of such a thing.

Aug. 06 2017 11:34 AM
L from MI

I am incredibly disappointed in you guys for this one. The futureoffakenews clip is laughably bad - the inconsistent audio, the cartoonish facial movements, the stock camera flash sound effects, everything. I cannot honestly believe anyone found it convincing. Based on their reaction, I'm seriously questioning the judgment (or, sad to say, the integrity) of Jad and Robert here.

Aug. 05 2017 11:53 PM
Sylwia

This technology brings a new level to people's ability to catfish others online, deceiving someone into believing they are speaking with someone else via video and giving a greater false sense of security.

Aug. 05 2017 11:32 AM
Paul Clarke from Evanston, IL

Fake news is, and will always be about trust, not technology.

Methods will be developed to tell if a video or audio clip has been falsified, and even if these are expensive, you will always be able to check with credible news organizations for their take on a story you see online. The real problem is getting people to trust the people with the truth, and not the fakers.

Aug. 04 2017 06:42 PM
Chuck B. Ans from Spokane

I like a story on fake news where the interviewer pretends to be surprised by the things he's learning from the person he is interviewing.

Aug. 04 2017 02:43 PM
Gabriel from Mexico

The disregard for the societal impact on the part of the scientists who are developing these technologies is staggering. They're just playing with their toy questions and techniques to make them work, but they don't care at all about what this means for society as a whole. Not to mention their opaque connections to Google and Facebook. I can see why you left that long silence after Dr. Kemelmacher-Shlizerman's struggling with such a simple question as "Are you afraid of the power of this?" and her response along the lines of "I'm a computer scientist... people should work on that." Why not the scientists in the first place? They are developing these techniques, and they should be the first to be aware of what they mean for society.

Aug. 03 2017 12:00 PM
Ben from Parker CO

When you are motivated by fear everything can look like a monster. The most disturbing thing I heard in that story was the new layer of polish getting applied to the jackboots.

Aug. 01 2017 09:33 PM
sleepingtiger

The site referenced in show is
Futureoffakenews.com

Aug. 01 2017 09:32 PM
David from Los Angeles, California

Wow. You guys caught a lot of flak for approaching the story the way you did. I think you did a great job of opening the eyes of non-tech people to this.

We had this very same discussion 30 years ago when Photoshop first came on the scene. We were able to manipulate pictures in Photoshop on a Mac, and it blew our minds and wrenched our guts.

For the next 10 years, when you tried to fake a photo, the "customizations" were obvious. Partially because we couldn't scan (no digital photography like today) and work at the resolution necessary to hide the work. There were blurs around the seams, there were lighting and color issues, the final rendering of the image was pretty bad. And, we'd throw our hands up and say "one day".

That day arrived about 5-7 years ago when you could easily work at a 10 Megapixel level - you could take the picture at the resolution, work at that resolution, and then "shrink" the image back down to a reasonable resolution for moving it.

And now, it takes a forensic expert (like your guest) to tell the difference. Perhaps video manipulation will have the same time frame??

Regardless, within those 30 years, the term "Photoshop" became a verb. Not because so many people use the tool - but because everyone knows the ability is there for something you see to not be true.

And once we humans know of a thing like that, we will always be ready for it. After all, that's what we humans strive to do - we look for patterns and oddities, explore them, understand them, so that we know to take them into account the next time we encounter them.

When the signal says "Walk" but the light is red, what's the first thing we do? Look around to corroborate. What are the other lights telling me? What are the other people and vehicles doing?

When you get a text from your loved one that doesn't "feel" right, you corroborate. It's coming from them, and it uses the same words they use. But it's just "off" somehow. You call them.

So, yes, it is scary to see that someone can fool me. But now that I know it's there, I won't be fooled the next time I see a video that doesn't sound, look, and ACT like the president.

And maybe that's why you aired the story to begin with.

Aug. 01 2017 05:13 PM
Britten from Nevada

Idiots... why are the smartest folks, who make things that everyone knows are going to be used for the wrong purposes, so incompetent? Having the woman scientist just say "duh, don't know" when pressed about how this software is going to be used makes her the dumbest person I know with an IQ of 150+. I hope she takes to bed the thought that software designed by her and others has no real good purpose and ultimately will be used just to harm other humans... good job.

Aug. 01 2017 04:52 PM
Message in a bottle from Brasilia

I believe that tools like these will continue to evolve, and what really has to change is our minds and the biased ways people tend to see the world. Education will have to make media literacy a priority. We have to start educating people to think outside the box, or outside the red and blue boxes, and all the other boxes. This is the most challenging mission for educators of the future. It looks like there is a lot of science trying to advance what machines can do, but very little is being done to change our minds. There's a lot of irrationality in humanity; we really need to evolve.

Aug. 01 2017 01:04 PM
Danielle Drabeck from Minneapolis, MN

As a scientist I am constantly concerned by the impulse of folks to pose scientific censorship as the solution to the problem of evolving technologies that require ethical and legal changes as well. This approach has never been successful, and never will be, because science proceeds across the world when technologies become available, whether we censor it or not. Holding back scientific progress is like trying to contain a cup of water in a washcloth. The only thing that ever results from this is other groups doing it first (case in point: the nuclear program, the space race, stem cell research, IVF... etc., ad nauseam).
Let us strive to push for strong and thoughtful programs in philosophy and ethics, and let us also invest in ways to encourage collaboration between thoughtful philosophers of science and ethics and scientists. For the love of god, please let us start recognizing that science outreach to ADULTS and POLITICIANS is important, and will facilitate sound ethical choices. These are real solutions. These are the types of solutions that prepare us for the technologies of the future. The temporary band-aid of censorship in the face of innovation has only ever left societies to face an inevitable new future blinded and unprepared.

Aug. 01 2017 11:59 AM
Devin from Iowa

I was into this story and thinking, yeah, it is a concern that people can make these fake videos and pass them off as real, but also thinking that we've kind of already been through this kind of thing; this just makes it a little easier.

The biggest surprise was after seeing the fake video with fake audio. That was awful. Why were the people of Radiolab spooked? Seriously? That was so obviously fake, and I find it hard to believe that anyone might think it was real. A real letdown. This story kind of seemed like there wasn't any real story here and you guys tried to turn it into one. I really enjoy listening to Radiolab, but this story was pretty disappointing.

Aug. 01 2017 10:18 AM
Chelsea Boyle from Berkeley, CA

At first I was quite anxious after hearing this, but then I thought about all the misinformation and lack of basic critical thought already rampant now. It happens amongst all walks of life, particularly on social media.

I'm a liberal person in an educated liberal bubble, but misguided information gets repeated to me _all the time_. All it takes is one confident-sounding person saying something surprising yet plausible and it gets repeated and rehashed all over the internet. Then the sheer numbers of repeated statements mimic credibility. Very few people ever bother to do the most basic google search on something they've read, especially on social media, even when they don't recognize the source as being credible.

I guess what I'm saying is, using fancy technology to create misinformation is actually almost overkill. Due to a lack of habituated basic critical thinking and healthy scepticism amongst the whole population, all that's actually required are some convincing-sounding words. I assume, though, that curated sources of information will be able to adequately take into account the possibility of falsified video and audio, should these technologies become common.

I guess _my_ biggest worry is actually cyber bullying - people using the technology to create defaming videos that are embarrassing, despite known falseness.

Aug. 01 2017 02:51 AM
David from South Carolina

Whatever benefits this provides the world, the liabilities will surely outweigh them. But it's out there, and now we'll have to deal with it. Yet another fruitless arms race to occupy human time!

Jul. 31 2017 11:20 PM
Michael from NYC

Of course the prospect of easy, convincing fake videos being misused is alarming.

I think, though, that there are a few things to keep in mind. First, the futureoffakenews.com clip is actually pretty crude. Obama's head keeps changing shape like a partially filled water balloon. I also very much doubt that the Voco demo was completely legit -- if it were being used in field conditions, on data and edits that had not been carefully screened to produce an ideal result, it would likely have sounded more like the robot presidents. It may be a very powerful tool, but I doubt very much it will be as easy to use as the demo made out. These tools are not quite there yet. Doubtless, they will continue to improve.

On the other hand, is puppeted video really a game-changer? As the Radiolab folks pointed out, the gunman who showed up at Comet Ping Pong pizza didn't require video, or really any evidence whatsoever, to set him on his course. On a more benign note, an acquaintance of mine recently shared (on social media) a photograph of Ewan McGregor in costume as Obi-Wan Kenobi, asserting that it was a picture of Jesus. People eager to accept things uncritically will continue to do so; video is just another item to accept or not.

When the time comes that people really can generate convincing fake audio/video with easily accessible tools, we will all need to evaluate the sources and use our judgement to decide whether to believe it, just like we already have to do with still images and words. We just need to get over our delusion that video is somehow more innately trustworthy than other kinds of media. It never was to begin with -- videos are often already altered in less high-tech ways, or even simply taken out of context to convey a completely false message.

My feeling is that this is something to look out for, but less worrisome than the widespread state of ignorance that leads so many to accept fantasy as reality.

Jul. 31 2017 10:06 PM
Skep Tic from LA Westsiiiiide

How will I ever believe future Radiolabs after this? Alas, audio/video consumption has just become an act of faith, not fact.

Jul. 31 2017 08:11 PM
Paul from Paso Robles

I have no doubt that this technology is coming. There is a lot of money and power at stake here, and it's way too tempting to have the ability to sway people to part with their money or to confirm or shape their political views.

No matter what fake news would be generated, the responses could be:

Side A: Our experts have analyzed the video and determined that it is genuine.
Side B: Our experts have analyzed the video and determined that it is fake.

Who believes what?

Even now, it's so easy to fool the masses.

Jul. 31 2017 04:18 PM
Sam from Bonn, Germany

I liked and listened to this podcast because it is an intriguing topic, but this topic is not fresh. There was a conspiracy podcast on YouTube about this months ago. And that was a conspiracy podcast.

Radiolab, I thought, is not a place for conspiracy podcasts. Where is the science? More science! Gimme more SCIENCE, not just abuse-of-technology stories/scare topics.

If you want to go into abuse of technology, consider also the abuse of ignoring technology. Excellent example: the German auto industry (and other auto companies) and how that affects all of Europe and the world.

I like science. Maybe someone can recommend a more scientific-geared podcast for me?

Jul. 31 2017 10:53 AM
Richard from NC

Listened to this episode this weekend.

Along the lines of VoCo look at what Google is doing with DeepMind: https://deepmind.com/blog/wavenet-generative-model-raw-audio/

Jul. 31 2017 08:11 AM

I'm very relieved. Both the audio segment of modified speech played during the podcast as well as the altered speech by Obama are ridiculously amateur-looking. This is highlighted best in the video when Obama says "golf". The sound is clipped and obviously not part of the original. I know several video editors who "sentence mix" much better without the aid of speech recognition software.

The technology will certainly develop over time, but for now there is no real threat in mistaking fake audio/video for what it is. I would certainly like to play with Voco when it is available.

Jul. 31 2017 06:57 AM
John Morgan from Portland

More information does not make it easier to fake news; it makes it harder. Information has been faked countless times for countless purposes throughout history. It is currently much harder to fake news and information, and it will probably remain harder, than it was when sketches substituted for pictures, written accounts substituted for video, etc.

This counterbalancing fact which is indeed common knowledge to any college educated person found no place in this story.

The way this story may be most alarming, and therefore the way this story may be most engaging, valued, shared, re-listened to, etc. is to ignore contrary information and arguments, and maximize fear of consequences in the listener.

Dedicating some portion of this show to cogent, educated, respected and prepared people with a contrary opinion to all this alarmism, is all it would have taken from the many intelligent and decent people of RadioLab to overcome media bias and help re-establish trust in mass media. But they didn't.

That's what's scary. That so innate is the human drive to grasp the brass ring, our own minds quell pangs of hypocrisy, ambivalence, and doubt. We cannot see, even while recording a show on the future of fake news, the systemic tendencies of competitive media to distort information in pursuit of the biggest story possible.

Jul. 31 2017 01:42 AM
Roi James from Austin, TX

Does anyone really believe that politically biased news organizations would not use this technology to manipulate their audiences? Organizations such as FOXnews and Breitbart, who apparently intentionally misrepresent facts and proliferate "alternative facts" in support of their political and ideological agendas. Not to mention the truly rogue, and purely vindictive, media sources on the internet who seem to want to bring systems down just because they can. This software is an authoritarian state's wet dream. I heard of this tech before the Radiolab segment and felt overwhelmed at what is coming down the road towards democracy and free societies. I don't know how we are going to be able to combat this when, at present, intentionally misleading and untrue fake news stories which are written, not even backed up by video, have alarmed and activated segments of the population so as to divide the nation in two.

What upsets me the most is what appears to be Dr. Kemelmacher-Schlizerman's side-stepping the question of seeing the nefarious uses of the technology. Her evasiveness practically convicts her of the crimes to come. And they WILL come. If she had the courage to really look at that future, she might consider ways to protect the things that are precious in this world from this technology. She says she's building this technology so she can have a conversation with an Avatar mother. Well consider that one day, someone could use that technology to convict her and her mother of a crime they didn't commit by having them say something they never said. The most obvious focus of this tech is on political abuse by authority figures but I see police abuse through manufactured confessions that never took place. How can you say you never committed a crime when we (the police) have a video of you saying you committed the crime? The world is wide and waiting for the myriad forms of abuse this technology can and will take. But oh! I can't wait to talk to my avatar mom! Someone is going to get rich on this tech, regardless of the nations it brings down.

Jul. 30 2017 10:42 PM
chris LM from San Jose

This technology provides a lot of additional opportunities for abuse. There is a corresponding technological solution: the incorporation of digital signatures, especially distributed ones like blockchain, into recording devices. Trusted news would include the original media and allow for verification of the content, at least in terms of proof against digital manipulation and of the originating device.
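The device-level signing idea could be sketched, very roughly, like this (a toy sketch: the key name and scheme are invented for illustration, and a real camera would use hardware-backed public-key signatures rather than a shared secret):

```python
import hashlib
import hmac

# Hypothetical: a recording device holds a secret key and signs each
# captured media file; a verifier holding the same key can later check
# that the bytes were never altered. A real device would use asymmetric
# signatures so verifiers never need the secret itself.
DEVICE_KEY = b"secret-key-burned-into-device"

def sign_recording(media_bytes: bytes) -> str:
    """Return an HMAC-SHA256 signature over the raw recorded bytes."""
    return hmac.new(DEVICE_KEY, media_bytes, hashlib.sha256).hexdigest()

def verify_recording(media_bytes: bytes, signature: str) -> bool:
    """True only if the media is byte-for-byte what the device signed."""
    expected = sign_recording(media_bytes)
    return hmac.compare_digest(expected, signature)

original = b"\x00\x01raw video frames..."
sig = sign_recording(original)

print(verify_recording(original, sig))            # True: untouched
print(verify_recording(original + b"edit", sig))  # False: manipulated
```

Any single-bit edit to the footage changes the digest, so the signature check fails even though the forgery might fool the human eye.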

Jul. 30 2017 05:54 PM
Mike from Madrid

I have to question Radiolab's credibility on the concern expressed in the Breaking News podcast. While I certainly agree that audio and video manipulation are serious and creepy threats to society, I consider Radiolab’s editing to be part of the problem. Can you honestly say that editors do not insert dramatic pauses into recorded interviews? Or have Jad re-phrase questions during editing (after the interview) to share your version of the story more directly?

Radiolab clearly has a significant amount of post-production editing and sound effects. But on serious stories like Breaking News, this manipulation is not appropriate and can mislead listeners. I, for one, already lost some trust in Radiolab due to the very themes highlighted in Breaking News.

Jul. 30 2017 04:09 PM
Jacqueline

The scariest part of this podcast was Prof. Kemelmacher-Shlizerman's inability to answer basic questions about possible ethical concerns, washing her hands of any moral obligations here. It is not as if she should stop her work, but at a minimum she should reflect on the possible applications and suggest policy or a code of ethics that can introduce safeguards.

Jul. 30 2017 11:48 AM
Chris Gurin from Philadelphia, PA

This story arouses way too many inchoate thoughts to compose a sensible narrative, so I'll default to a grocery list of anxiety:

1. The myopic developers of this technology have unleashed a sort of cognitive terrorism we've all been dreading: an arms race of distorted reality and "alternative facts".

2. The greatest disasters in human history always arise from unintended consequences.

3. "When you see something that is technically sweet, you go ahead and do it and you argue about what to do about it only after you have had your technical success. That is the way it was with the atomic bomb." - J. Robert Oppenheimer

4. The only way to stop something this awful is to unplug EVERYTHING: who wants to go first?

5. I think the doomsday clock has been tracking the wrong extinction event.

Jul. 30 2017 09:04 AM
Keith Duddy from Brisbane, Australia

Clutch Cargo was doing this in the 50s:

https://www.youtube.com/watch?v=UTQI_XjoSSY

...|<

Jul. 30 2017 06:48 AM
Steve from Pittsburgh

This may well be the scariest thing that I've ever heard. At this moment in history, the absolute last thing we need is technology capable of creating artificial spoken word and video of whomever we want saying and doing whatever we want. I don't think our democracy can stand it. I'm especially amazed at how the computer scientist that produced the video technology, Ira, appears to have never considered its possible malicious uses and seems totally unconcerned by them. What are they thinking? This is even more concerning to me than the potential nefarious uses of CRISPR - at least there are some regulatory bodies in science and its use requires some scientific training not possessed by the typical person. There are internet trolls all over the world that are going to have a field day with this. They've opened Pandora's box, and based on this interview, didn't even know it until now? I sincerely hope that they have a plan to regulate this technology because this seems like a "blood on your hands" type of moment.

Radiolab team - can you please do a follow-up with the people making this technology to press them on this point?

Jul. 29 2017 09:11 PM
T.A. Barnhart from Portland, OR

I'm disappointed you missed the most obvious protection, which is also the fastest growing way to get content on the web: livecasting. If there is an event with multiple livecasts – one on FB Live, one on Twitter, etc – and they match one another, we can know what actually happened. If something turns up that doesn't match the livecasts, then we know what's fake.

And people are doing live more & more. It's getting easier & easier. As long as something is livecasted, you'll have that record as your fact check.

And frankly, spreading false content has been done forever. It doesn't take tech; it just takes someone willing to lie & someone to believe that lie.

Jul. 29 2017 07:08 PM
You Tube from Uganda, Africa

This is why we need to stop worrying about global warming or politics, and focus on the explosion of AI and technology. I sincerely predict that by the year 2050, we will be able to affordably upload ourselves into an external battery/simulation.

Jul. 29 2017 06:56 PM
Roman from Russia

Guys, the word/sentence mixing stuff is old news. People in the YTP community have been doing this for years! BY HAND!

Jul. 29 2017 06:39 PM
Galatae

Trying to look for positives, and I can see how some of this technology could be beneficial in healthcare, specifically for people who have catastrophic injuries that would affect their speech. Imagine if Hawking had 40 minutes of speech to turn into his own voice.

That being said. This is how wars are started. Say someone wants North Korea to go after the US and NOW they actually have the wherewithal to do it, what's going to stop them from thinking they have proof of some transgression they can just manufacture? It is no longer a matter of "If" it will happen, it's "When". These are really bizarre times we live in.

Jul. 29 2017 03:07 PM
zach from Raleigh, NC, USA

For a while now "digitally-signed" content has become more and more prevalent.

I am sure the authors of this piece got an ear-full on how digital content can be "watermarked". I do expect that unsigned content on "The-Net" will fast become a thing of the past. I am OK with the authors leaving this out since if they included it then the story would not have the same impact as it does, and I do think this technology should receive wide scrutiny.

Once digital content routinely has a 'watermark' then all social media sites will most likely only carry such content.

And, guess what? We could also extend this concept so you could trace-back everyone who shared/forwarded such content. And you could see just where it 'really' came from.

This will not eliminate "Fake-News", since some channel like The Faux News Network (Faux/Fos sp?) still could make up their own crap, but they could NOT put 'words/expressions in/on someone's mouth/face'.

But when every single wad of digital content on the net has its own watermark and audit trail, it will be very straightforward to tell fact from fancy.
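A toy version of that trace-back audit trail (the scheme and names here are invented for illustration): each forward appends an entry whose hash binds it to the entire history before it, so the true origin can be traced and not silently rewritten.

```python
import hashlib

def entry_hash(sharer: str, prev_hash: str) -> str:
    """Hash binding this share to all the history before it."""
    return hashlib.sha256(f"{sharer}|{prev_hash}".encode()).hexdigest()

def share(trail: list, sharer: str) -> list:
    """Append a new audit-trail entry; the first entry is the origin."""
    prev = trail[-1]["hash"] if trail else "origin"
    return trail + [{"sharer": sharer, "hash": entry_hash(sharer, prev)}]

def verify_trail(trail: list) -> bool:
    """Recompute every link; any tampered entry breaks the chain."""
    prev = "origin"
    for e in trail:
        if e["hash"] != entry_hash(e["sharer"], prev):
            return False
        prev = e["hash"]
    return True

trail = share(share(share([], "original_uploader"), "alice"), "bob")
print(trail[0]["sharer"])   # the content's true origin
print(verify_trail(trail))  # True: chain intact

trail[0]["sharer"] = "someone_else"  # try to forge the origin...
print(verify_trail(trail))  # ...and verification fails: False
```

Because each entry's hash depends on every entry before it, rewriting the origin (or any middle hop) invalidates the whole chain downstream.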

Jul. 29 2017 12:36 PM
David from Idaho

Ick... This was the first Radiolab that I ever shut off halfway through. They kept acting like this tech was one day going to be used to cause great harm to society. Um... Nope! We're not all idiots, thanks. People will know that this type of thing is possible to fake now, and it'll only be used for jokes. Went way overboard with the communicating-with-ghosts thing. The uncanny valley still exists here, and it's obvious.

Jul. 29 2017 03:05 AM
Daniel from Los Altos, CA

While the concerns of this piece are understandable, I think it undervalues how much context matters when people evaluate trustworthiness. It's not like this hasn't happened already - photo editing tools create edited images that are virtually indistinguishable from originals. Sure, this has led to some abuse by fake news, as well as increased skepticism that the images we see are real. But for the most part, society evaluates things well - they know when to cast a suspicious eye or call something out as probably Photoshopped, but it hasn't led to wholesale distrust of news photos. People know that images are easily manipulated, but still trust them, recognizing that where they come from matters.

Being able to do the same with audio/video takes this to a different level, but I'm pretty sure the consequence will be the same - people will be more skeptical with videos they see, but will still trust some, depending on the source.

Jul. 29 2017 03:03 AM
John

I'm pretty familiar with this technology, and I can tell you it will only get much worse. Imagine being able to synthesize an image or video of anybody you want saying anything you want, doing anything you want. Easily possible within 5 years. Only hope is some sort of cryptography to verify authenticity. Or maybe smash all the computers? Butlerian jihad!

Jul. 29 2017 02:43 AM
Spence from WV

A different spin on "fake news" from someone in local news: I could see an application, once the quality is 100%, where local news will use this so your local anchor person can be on multiple streams at once - or weather can be constantly presented by a likeness of your meteorologist so they "never need days off" and can literally be in two (or more) places at once.

The cost savings to Broadcasters is immense. A likeness of Wolf Blitzer never needs to leave The Situation Room.

Or better yet - once they fix facial issues - a company can just make "the ultimate news presenter" saving tons of money. Build who you want and they never make anybody upset like Ron Burgundy!

Bad news for people like me: it's going to be easy to be replaced by this technology. Very easy.

It also brings up legal issues - most stations make "talent" sign away rights to their likenesses .... I would want a cut of the profit if my "virtual self" was making money for the firm ... and the non-compete ramifications are dizzying. What if someone like Lester Holt were to be hired by another network? He has to sit out a period of time - but if his former network had flawless technology, they could virtually keep his likeness on air and performing for a long time until his non-compete was up.

Think of "commodities" on air like traffic and stock reports. Those are already automated, and a person just reads them. Computers kick out full-blown stories for newspaper/Web for items like that on the fly ... now add voice synthesis and a matching facial image to that text ... all done without human writers. Bad for jobs for local media folks, I say, and a big can of worms.

May be 20 years away but sounds more like 10.

Amazing...

Jul. 29 2017 01:52 AM
Chris B

I really think Jad and Robert should re-examine this issue from a different perspective. Instead of running around like a couple of Chicken Littles, maybe they could look back at how the introduction of new forms of media or media editing technology has been treated in the past. I suspect that they'll find a lot of the same sort of sky-is-falling rhetoric about things like photography, film, radio, audio tape, and digital images. These technologies have all changed society in complicated ways. Manipulation of recorded images or sounds has been around as long as we've had such recordings. Nothing they present in the episode (except perhaps that Jordan Peele clip, and that could have been done by any good human imitator) offers solid evidence that this kind of technology will make it harder to tell truth from fiction.

Jul. 28 2017 09:46 PM
Nicole from Miami, FL

That video in no way looked real. Either I'm greatly over-estimating the intelligence of the general public or I'm a hard person to "get one over" on, but that video was horrible! Looked like something a high school computer class could've put together. In no way was I duped into thinking it was really Obama saying that.

Jul. 28 2017 05:22 PM

OMG! I can't believe all those scientists that invented the internet 40 years ago. I'm sure someone was like, "Once everyone can broadcast their own thoughts and opinions to millions, how will people know the truth!? We must ban this technology; only licensed newspapers and TV stations should be able to reach that many people. If the average person has this tech, it's the end of the world." Oh, I can't believe those scientists who invented the internet had no ethics. If they had, they'd never have made it.

:rolleyes:

Fascinating show, but there are MUCH bigger things to worry about than the latest Hollywood FX tech.

Jul. 28 2017 05:02 PM
Daniel T from Hillsborough NC

What I thought was the obvious question that wasn't addressed is why did the VoCo demonstration work so much better than the examples we heard at the end of the episode?

Jul. 28 2017 03:55 PM
Lydia

The female scientist's explanation elides responsibility completely. She might as well have said, "I built this nuke... but that's my job as a scientist. How people use it? Education! Educate them about it."

No, maybe don't use your resources and talents to build something with such obvious nefarious uses. You are responsible for releasing it into the world. It's completely self-centered to say everyone must adjust! Everyone must make themselves safe from this. If people misuse it-- that's not my fault.

In essence she is saying 'I am a computer scientist I have no responsibility'. That is a childish view of ethics.

Jul. 28 2017 03:12 PM
Sir Tafticus from Green Bay, Wisconsin

Another thing that must be considered is the issue of accessibility to NEW or expensive technology. I say this because I cannot afford a 4K television set with a 48-inch display, but if I did, your experiment video would be RIDICULOUSLY obvious as a fake. It's not so much about your experiment video, but about how any imperfection in the audio and video is more easily recognizable on a display that is crystal clear, with a speaker that is high definition. So demographics, wealth, and accessibility factor into how easily I can be fooled.

Jul. 28 2017 02:20 PM
Jakob Gowell from Rhinebeck, NY

Once the technology has advanced to generating novel voices not belonging to any particular extant human, Radiolab could use it to generate the audio for the credits without needing to have anybody call in... But what fun would that be?

Jul. 28 2017 01:53 PM
David from San Rafael

The word that everybody danced around but nobody used is "ethics."

Jul. 28 2017 01:33 PM
Greg from Denver

Probably as disconcerting as the technology itself was the attitude of the researcher helping to develop it. It was plainly obvious that she knew of the ramifications but preferred to handle her cognitive dissonance by stating she was "just a computer scientist".

I also noted the increasingly intertwined relationship between academics and commercial companies, which probably helps to further cloud the judgment of these researchers.

Jul. 28 2017 01:15 PM
Podcast Junkie from AlbuKwerky, NM

Well thanks for that horrifying story.

Science and tech research is like Pandora's box. Once the lid is open, there is no turning back. My naive hope is that this will force people to stop being mentally lazy and actually consider the source and question the intent of the distributor.

Jul. 28 2017 12:40 PM
Aaron from St. George, Utah

WRT my prior comment -- please forgive the voice-to-text errors or fat-finger phone keyboard typos I missed while editing that comment on my phone (i.e. rock => raw). And let me add that video and audio provenance will become more important in the future, and perhaps the phrase "Show me the [block]chain!" will become the mantra of those desiring proof that a recording is authentic.

-Aaron
St.George, Utah

Jul. 28 2017 12:27 PM
Aaron from St. George, Utah

You may want to look into the intersection of blockchain technology with developments in audio and video recording equipment. There are those who are working on creating hardware that will cryptographically sign recorded data as it is captured. When used in real time with a connection to the Internet, where the cryptographic signatures can be submitted immediately and included in a public blockchain, the signatures plus blockchain inclusion create an unforgeable, timestamped proof of existence. So at a live event, if you have multiple people recording video at various angles, using cryptographically secure digital signatures submitted to a public blockchain in real time, the raw, unedited recorded data becomes a powerful proof of what was actually said or what actually happened at the live event. Imagine if political opponents are using this technology and both are recording the same event. So long as one has access to the raw, unedited data, signed and in the blockchain, it should be impossible for anyone to create fake variations of that event without those forgeries being easily recognizable when compared to the authenticated data.
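A minimal sketch of that proof-of-existence idea, assuming a public ledger that here is just an in-memory list rather than a real blockchain (all names are invented for illustration):

```python
import hashlib
import time

# Hypothetical: recorders publish timestamped fingerprints of their raw
# footage to a public ledger at capture time. Later, any clip whose hash
# was never registered has no proof it existed when the event happened.
public_ledger = []

def register(recording: bytes, recorder_id: str) -> None:
    """Publish a timestamped fingerprint of the raw recording."""
    public_ledger.append({
        "recorder": recorder_id,
        "digest": hashlib.sha256(recording).hexdigest(),
        "timestamp": time.time(),
    })

def is_authentic(recording: bytes) -> bool:
    """True only if this exact footage was registered at capture time."""
    digest = hashlib.sha256(recording).hexdigest()
    return any(entry["digest"] == digest for entry in public_ledger)

# Two people film the same speech from different angles.
register(b"angle-1 raw footage", "camera_A")
register(b"angle-2 raw footage", "camera_B")

print(is_authentic(b"angle-1 raw footage"))  # True: registered live
print(is_authentic(b"doctored footage"))     # False: no prior proof
```

The timestamp is what does the work: a forgery produced after the fact cannot retroactively insert its hash into a ledger entry dated to the moment of the event.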

While there are some things that may lead to some degree of worry, there are other rising technologies that may mitigate some of the potential issues.

-Aaron
St. George, Utah

Jul. 28 2017 12:07 PM
Jim G from Omaha

This is another example of movies predicting the future. If you watch the old Arnold Schwarzenegger movie The Running Man, this is how his character was set up in the movie.

Jul. 28 2017 11:08 AM
Neo from Atlanta GA

Also.. the use of VR physics lectures by Einstein cracked me up !

Back when ARPANET was being created, apparently someone asked the scientists what use they saw for the internet. And the scientists said something like the open exchange of ideas on meaningful subjects like art, science, etc.

I bet they never saw "I can has cheese burger" coming. :)

I am willing to bet this technology will have a lot more "fun" ;) applications than VR lectures by Einstein. (Because a. I like Feynman's lectures better, and b. per my 3-year-old, a talking tomcat and chipmunk songs are more fun than Einstein's lectures... trust me... I tried!)

Jul. 28 2017 10:06 AM
Neo from Atlanta GA

Here's an analogy :

Imagine an artisan in the time of Michelangelo. Let's say he creates a pigment with a brilliant shade of blue. If someone asked him what use he envisioned for this blue, he might have said something like: "Great artists like Michelangelo are painting the ceiling of the Sistine Chapel. Trust me, my shade is gonna be so good, people are gonna talk about it for the next 20 years!"

That did happen, for sure, but it is very difficult for this artisan to understand the profound implications of his new pigment. He might not have imagined Michelangelo's work would be talked about for a few hundred years. He could not have imagined its use by masters of painting. Nor could he have imagined a 3-year-old in the 21st century using HIS pigment to finger-paint a stick-figure family at her daycare.

Similarly, understanding consequences of fundamental technology always ends up falling short. We typically ask a VERY narrow user base, with very narrow context to contemplate the extent of future uses.

In this article, we explored the uses of this technology with a few, very-well-qualified-but-still-only few, people out of 7 billion.

Question is
1. what uses could come about if this new tool reaches billions of people, many years down the line?
2. How many and what positive consequences could there be ?
3. How many and what negative consequences could there be?
4. Can we figure out if positives would win out or would negatives win out?

Also.. keep in mind.. we human beings (including me) have a "loss aversion bias". We fear a minor loss more than we wish for a major win. In this context, even if this technology has positive consequences, we are wired to fear the downsides more.

So - should the fear of unknown downsides outweigh exploring and creating new technologies (or pigments)?

Maybe we should let this play out?

Jul. 28 2017 09:59 AM
Youssef

Oh PLUS the audio replacements of Obama's voice were just so freaking obvious. You could hear different audio quality on the replacements.

Jul. 28 2017 09:52 AM
Youssef

I just watched the Obama video recreation and honestly it's so shitty (sorry Simon) I don't know why you guys are freaking out about it.

What was more convincing was the Face2Face facial reenactment examples of Bush & Putin I found online. But this example of Obama screamed of manipulation. His mouth was not moving naturally at all; it was being stretched while open, and it looked like it was being puppeteered. I guarantee you, if I saw this video on YouTube while browsing, I would spot it.

Jul. 28 2017 09:50 AM
Josh Marthy from Albany, NY

The future is now

Jul. 28 2017 09:01 AM
Flip Schrameijer from Amsterdam, The Netherlands

I totally agree with you that this is deeply troubling. I'm amazed and, again, very troubled by the total lack of concern from this maker (didn't catch her name). Indeed, as you say, "is the world ready for this"?

What frightens me, for one thing, is the possibility that Trump can now deny anything he said on the campaign trail, such as humiliating the impaired journalist: fake news.

I hope methods are indeed being developed which can quickly, and beyond any doubt, tell fakery from reality.

Jul. 28 2017 08:41 AM
