
Confirmed: Google is trying to win elections for Hillary

Transcribed by Joe Delaplaine

The following is a rush transcript of the Sept. 12 episode of Loud & Clear with Brian Becker on Radio Sputnik. Copy may not be in its final form.

Brian Becker (BB): Today we bring you a very special edition of “Loud and Clear” with an exclusive, breaking story about Google’s manipulation of search suggestions, and how it could pose a serious threat to democracy.

A Sputnik exclusive reveals that a team of psychological researchers led by Dr. Robert Epstein, an author and research psychologist at the American Institute for Behavioral Research and Technology, found that this may be the case.

Previously, Dr. Robert Epstein revealed that Google has the power to influence an election, or even determine the winner, [using] the “Search Engine Manipulation Effect.”

Now Dr. Epstein has published new research pointing to the existence of a new phenomenon, the “Search Suggestion Effect.” He joins us for the full hour. Welcome back, Dr. Epstein.

Dr. Robert Epstein (RE): It’s my pleasure, Brian.

BB: Thank you for joining us, and thank you for your work. Walk us through this new research that you published today. Our listeners may remember you from two previous interviews you did on the show where you talked about the power of Google. What have you uncovered now?

RE: Well, it’s something new and something I was quite surprised to find. You may recall that back in June an organization called “SourceFed” released a video in which they claimed that Google’s search suggestions, which Google calls “Autocomplete,” were biased in favor of Hillary Clinton.

And the video was pretty good. It was just a guy talking at you, but it was very powerful, and so it actually went viral. It’s had more than a million views on YouTube and more than 25 million views on Facebook. So I would call that “viral,” especially for something like this.

So then, Google actually communicated with some journalists, denied that their search suggestions favor any candidate, and said that SourceFed couldn’t find negative things about Hillary Clinton because, they said, it’s their policy to suppress negative suggestions for everybody. “For everybody,” that’s what they said.

So, this summer, my staff and I, the eight people who work for me, started looking into this to see if there was anything to SourceFed’s claims.

And then we also, along the way, got some ideas and conducted a new experiment. And that is the new experiment I’ve written about in the new article that just came out.

BB: Okay, so talk about what the results showed, specifically, if you would.

RE: Well, sure. First of all, in general, we did confirm SourceFed’s claims. Not completely, I should point that out, because you can, at times, get negative search suggestions related to Hillary Clinton from Google. It’s hard to do, but you can do it. And, at times, you can get neutral or even positive suggestions regarding Donald Trump, so you can do that too.

So, SourceFed’s claims were overstated. That’s one part of what we learned. But then we got curious, because there was a question that was never addressed by SourceFed or anyone else: “Why would Google suppress negative suggestions for one candidate?” That became the question.

So we conducted an experiment with three hundred people from forty-four U.S. states, a pretty diverse group of people. And in this experiment, we asked people to tell us which search suggestion they’d click on, given various sets of “search suggestions”. So we showed them different sets of search suggestions. In each case they could click on four different suggestions, or they could make up their own “search term”. So there’s always five possibilities for them.
And two of these search suggestions we showed them were negative.

So those were the only two negative search suggestions out of basically twenty suggestions in total that they saw. And by the way, all [of these suggestions] had to do with the Vice-Presidential candidates: Mike Pence, who is the Republican Vice-Presidential candidate, and Tim Kaine, who is the Democratic V.P. candidate.

And what we found was pretty shocking. If people were responding just at random to all these things, then the chance of them clicking on the two negative search suggestions would be twenty percent.

And the first big finding we found was that people were clicking on those negative items about twice as often as one would expect by chance. In other words, those negative items were drawing forty percent of clicks. So that was one outcome that was pretty dramatic and pretty clear.

Another outcome related to this is that, if you compare the number of clicks on the negative items to the number of clicks on what we call “Matched Control” items, which were just neutral items, then in general, people were clicking on the negative ones five times as often as on the neutral “control” items.

And if you look at one group of people in particular, “undecided eligible voters” –and those are the people, by the way, who decide close elections– if you just look at those people, “undecided eligible voters”, they were clicking on the negative search suggestions fifteen times as often as they clicked on the neutral control items. Fifteen times as often.
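[Editor’s note: For readers who want to follow the arithmetic, the figures cited above can be laid out as a short Python sketch. The numbers are simply the rates reported in this interview, not raw data from Dr. Epstein’s study, and the calculation below only illustrates the ratios he describes.]

# Illustrative arithmetic only: these are the rates reported in the interview,
# not raw data from the study itself.
chance_rate = 0.20             # expected share of clicks on the two negative items under random clicking
observed_negative_rate = 0.40  # reported share of clicks that actually landed on the negative items

# How many times more often the negative suggestions were clicked than chance would predict
lift_over_chance = observed_negative_rate / chance_rate
print(f"Negative suggestions drew {lift_over_chance:.0f}x the clicks expected by chance")

# Reported ratios of clicks on negative items versus matched neutral "control" items
all_participants_ratio = 5     # across all three hundred participants
undecided_voters_ratio = 15    # undecided eligible voters only
print(f"All participants clicked negative items {all_participants_ratio}x as often as control items")
print(f"Undecided eligible voters clicked negative items {undecided_voters_ratio}x as often as control items")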

So, what this is saying –and there’s a lot more I can tell you about it– but what this is saying, in a nutshell, is that search suggestions can be used very easily to manipulate people’s opinions about candidates –or anything else. All you have to do, if you support one candidate, is suppress negative search suggestions for that candidate. That ends up sending a lot of people to web pages which view that candidate very positively, but for the opposing candidate, you don’t suppress negative search suggestions.

BB: Let’s give some examples, because I was just looking at your article, the article that’s being published today. When you type in the search, “Hillary Clinton is…[space]”, and you do that in Google, and you do that in Yahoo, and you do that in Bing –three different technology companies offering internet searches. If you type “Hillary Clinton is…”, the first item that comes up in Google, according to your article, is “Hillary Clinton is winning.” The second [Google suggestion] is “Hillary Clinton is awesome.”

And then if you do it in Yahoo, [Yahoo’s suggestions display as] “Hillary Clinton is a liar”, or “Hillary Clinton is a criminal”, or “Hillary Clinton is evil”, or “Hillary Clinton is a crook”.

For one, there are a lot more search suggestions that come up in Yahoo and Bing, but they’re completely different. One [company displays] “Hillary Clinton is winning” and “Hillary Clinton is awesome”, and the other [company’s suggestions include] “Hillary Clinton is a liar”.

And in Bing, “Hillary Clinton is a filthy liar” is number one, then “Hillary Clinton is a murderess” and “Hillary Clinton, is she evil?”.

I mean, what’s the algorithm, and is it because there are terms that are “trending”? Is it because these are the most searched terms that would come to the fore, say, for Yahoo and Bing? Is your argument that those are trending topics and thus they would come first if they were just a pure reflection of trending questions?

And thus, when you type into Google “Hillary Clinton is [space]”, it comes up with “Hillary Clinton is winning” and “Hillary Clinton is awesome”, instead of that she’s a liar. Is that the proof –or some proof– that there’s a conscious manipulation on the part of Google?
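[Editor’s note: The cross-engine comparison described above can be spot-checked from a script. The Python sketch below queries the unofficial “suggest” endpoints that back the autocomplete boxes at Google and Bing; these endpoints are undocumented, are not the method used in Dr. Epstein’s article, and may change or be blocked at any time. Results also vary with time, location and personalization, so they will not match the examples quoted in this interview.]

import json
import urllib.parse
import urllib.request

# Unofficial autocomplete ("suggest") endpoints; both return JSON of the form
# [query, [suggestion1, suggestion2, ...]]. They are undocumented and may change.
ENDPOINTS = {
    "Google": "https://suggestqueries.google.com/complete/search?client=firefox&q={q}",
    "Bing": "https://api.bing.com/osjson.aspx?query={q}",
}

def suggestions(engine, query):
    """Fetch autocomplete suggestions for `query` from one engine's suggest endpoint."""
    url = ENDPOINTS[engine].format(q=urllib.parse.quote(query))
    with urllib.request.urlopen(url, timeout=10) as resp:
        charset = resp.headers.get_content_charset() or "utf-8"
        payload = json.loads(resp.read().decode(charset, errors="replace"))
    return payload[1]

if __name__ == "__main__":
    for engine in ENDPOINTS:
        print(engine, "->", suggestions(engine, "Hillary Clinton is"))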

RE: Well, no, the real proof comes when you look up the frequency of search terms on “Google Trends”, because Google Trends actually shows you the frequency of search terms. That’s where the proof is.

Because if you look up “Hillary is a liar” and “Hillary is awesome” on Google Trends –and this is in my new article– I can actually show you the “screen prints”.

You find that on Google, “Hillary is a liar” is a very frequently searched phrase, and virtually no one ever searches for “Hillary Clinton is awesome”.

And yet when Google is showing you search suggestions, [Google] does not show you, “Hillary is a liar”, [instead Google] shows you “Hillary Clinton is awesome”.

So, in other words, Bing and Yahoo and DuckDuckGo [3 search engine companies] are all showing you search suggestions based on what people are searching for. Period. That’s all they’re doing. You can actually look at Google Trends and you can find out for yourself.

[However,] Google is not doing that. It is not showing you search suggestions based on what other people are searching for. [Google] is showing you its own special, little, edited, censored list of search suggestions. And it is doing it –as we have now discovered– in a way that suppresses negative search suggestions for one candidate, but not for the opposing candidate. It suppresses negatives for the candidates it supports. And what this ends up doing is sending millions of people to positive web pages –glowing web pages– for the candidates it supports, and it sends millions of people to negative web pages –critical web pages– for the candidates it opposes. It’s a brilliant and very simple way of manipulating people, without anyone knowing they’re being manipulated.
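[Editor’s note: The Google Trends comparison Dr. Epstein describes can be approximated with the third-party pytrends library (pip install pytrends), which wraps the public Google Trends site and is not affiliated with Google. The phrases and timeframe below are illustrative assumptions, not the queries from his article; the sketch only compares relative search interest for two phrases.]

# Minimal sketch: compare relative search interest for two phrases on Google Trends.
# Requires the third-party pytrends package; phrases and timeframe are illustrative.
from pytrends.request import TrendReq

pytrends = TrendReq(hl="en-US", tz=0)
phrases = ["Hillary Clinton is a liar", "Hillary Clinton is awesome"]
pytrends.build_payload(kw_list=phrases, timeframe="2016-01-01 2016-09-12", geo="US")

interest = pytrends.interest_over_time()  # DataFrame of relative interest, scaled 0-100
for phrase in phrases:
    print(f"{phrase}: mean relative interest = {interest[phrase].mean():.1f}")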

BB: I think most people have some sense of trust that the algorithm has neutrality to it. I mean, there is a generalized trust that when people are searching for something, there’s not a conscious and injurious sort of manipulation going on. Do you think that’s true?

RE: Absolutely, and in fact there is a company called “Edelman” that actually measures the trust people have in various media sources. The annual Edelman survey shows that people trust an online source, like Google, much more than they trust any other media source. Because with other media sources there are competitors, and the “human element” is very obvious. If you’re watching CNN versus Fox News –or any other media source– the “human element” is very obvious, [meaning] their political slant is pretty obvious. But people trust online sources like Google to a much higher degree because the human element is invisible. And people think it’s all output from computer algorithms and is therefore completely objective and impartial.

And, of course, what we are finding, [and] what SourceFed found, is that that is absolutely not true. The trust people have in this source is based on an illusion. And it’s very, very dangerous. It’s a very dangerous “spell” to be weaving over the public.

BB: Dr. Epstein, let me turn to one of the examples you use in the article that you have written, which is being published today on Sputnik as an exclusive based on your research. I’m going to quote from your article.

First, on August 6th, when you typed [in] the question, “When is the election?”, you were shown the following [search result]: “November 8th, 2016. The United States presidential election of 2016 [is] constitutionally prescribed to occur, blah blah blah.” But next to it is a big picture of a smiling Hillary [Clinton]! There aren’t two candidates, there’s just one [shown].

I mean, how could it be anything other than a manipulation? It could have easily been both candidates [shown in the search result]. They were the two leading candidates, the Republican and the Democrat. We wouldn’t expect the third parties to make it because of “the negation of democracy” on that front.

But that’s what that shows: when you type in [using] Google, “When is the election?”, you get a picture of Hillary Clinton smiling!

RE: Well, yes, and that is pretty blatant. That’s blatant and obvious, they could have printed two pictures or four pictures just as easily as they printed one.

So that’s blatant, obvious and sloppy –really, if you think about it– on their part!

And, believe it or not, I’m not concerned about a mistake they make that’s so blatant. I’m much more concerned about search rankings, because there, what’s happening is invisible to people. Now, of course, I’m also concerned about search “suggestions”, because what’s happening there is invisible to people. But, yeah, that image where you see Hillary Clinton’s picture all by itself, when all you’re doing is asking Google, “When is the election?”, that’s obvious, blatant support for one candidate.

I give another example from last year, and this went on for several months: if you typed into Google, “Who is the next president?”, the answer you would get –and I actually show a screen print to prove this– was that “Hillary Clinton” is the next president.

BB: What’s Google’s motivation? Talk about Google’s support for Hillary Clinton. That support is not hidden. But this is hidden: the fact that they’re manipulating search suggestions and auto-complete suggestions in a way so that only favorable news about Hillary Clinton comes up. Or not “only,” but mainly favorable. While for the other candidates, [Google’s search suggestions and auto-complete are] both negative and positive, but certainly not skewed in this direction.

What’s Google’s motivation?

RE: Well, a lot has been written about Google’s support for Hillary Clinton, and this goes back to the beginning of her campaign. So you have to ask, “Okay, why would they support Hillary Clinton?”, as opposed to Donald Trump, or anyone who opposed her –for example, Bernie Sanders.

You’d have to look at the culture in Silicon Valley. You’d have to look at where their employees come from. They come mostly from the top schools in the Northeast, which tend to have a liberal, Democratic slant. So that’s part of it.

But part of it, too, has to do with this “revolving door”, which has been in place since President Obama came into office seven years ago.
