Sunday, April 19, 2015

Google and Overconfident Idiots

Since starting this blog I have spent a lot of time on psychology-related social media (find us on Twitter, Facebook, and Pinterest!).  So just like a meme can remind me of psychology, psychological research can remind me of a meme.  Lately there has been a lot of buzz over a collection of studies by Fisher, Goddu, and Keil (2015).  For example, the Wall Street Journal's website proclaimed, "How the Internet Makes You Think You Are Smart."  Similar headlines made me remember this meme starring Overly Suave IT Guy:


If Overly Suave IT Guy is not citing his sources, that is bad and you know all about that from this previous post.  But is he really an idiot?  There are several ways to be an "idiot":

1) Not knowing the answer.  But this does not fit because Google supplied him with the answers.

2) Not knowing how to find the answer.  But again, he knew about Google.

3) Believing you are more knowledgeable than you really are because you know how to Google.  Now this sounds a lot like that Wall Street Journal tagline...so what is that based on?

Fisher et al. performed a series of studies with the ultimate goal of assessing whether the act of doing Internet searches could prime us to feel more confident about our ability to give good answers.  Their paper is long and a bit convoluted, and it seems to me that many journalists did not slog through the whole thing.  If they had, they would have reported that:

Studies 1a and 1b provided the preliminary basis for study 1c.  In 1c participants were given a pre-test that asked them to rate from 1-7 how well they thought they could answer detailed questions on a number of topics.  Then half of the participants were asked a trivia question and given detailed instructions for how to search online for the answer; these instructions led them to a specific website.  The other half of the participants were asked the same trivia question, and after a 12.6-second break to emulate the time it would take to look up the information, they were given the exact transcript of the website to read.  Both groups rated themselves on their ability to answer the trivia question and then took a post-test that repeated the pre-test ratings activity.

Although the two groups did not differ on the pre-test, the group that looked up the information online rated themselves higher on their ability to answer the trivia question and give good answers on the post-test topics.  This effect remained even when the post-test instructions clarified that their ratings should assume that they would not use outside sources to come up with these answers (Study 2b).

In the post-test part of Study 2a the participants were told that better answers to questions were related to more brain activity.  Instead of rating themselves like before, participants from both groups (look up online vs no look up) had to choose which fMRI brain scans would best represent their ability to answer questions from each of the pre-test topics.  The group that looked up the information online was more likely to choose images of very active brains:  again the act of looking up information online was associated with a stronger belief in their ability to give good answers.

Study 3 replaced these fMRI images with another rating task:  participants from both groups had to rate their abilities to give good answers for questions about themselves instead of the pre-test topics.  Some of these questions were easy and others were mind-bendingly difficult ones like, "How did your learning style in your high school freshman year math class affect your interest in miniature golf" (p. 14).  There was no statistical difference between the groups on their assumed abilities to answer any of these questions, so the authors suggested that searching online does not give people global over-confidence.


Studies 4a, 4b, and 4c only assessed participants who looked up answers online, but varied how they could access the information or what they could find.  For example, in 4a participants who actively searched online (using lesser known search engines like duckduckgo.com) rated their ability to answer questions higher than participants who were given direct hyperlinks to the answers.  In the remaining two studies, no statistical difference was found in the ability ratings between participants who were led to find answers easily vs. with difficulty on Google, or in the ability ratings between participants who could retrieve few vs. no answers based on a diabolical Google filter.  Thus the authors concluded that it is active Internet searching that gives rise to confidence in one's answers, regardless of how fruitful those recent searches have been.  The authors also compared results from these three studies to the data from participants who were not able to look up information in the previous studies:  as before, the act of looking up information online predicted higher confidence in one's ability to give good answers.

See...it's rather exhausting.  But does it convince us that Overly Suave IT Guy really is an idiot but thinks he is bright because he Googles?  Certainly these are statistical differences that are unlikely to be due to chance, but are these practical differences that we could see in our own lives?  

In Fisher et al. the self-ratings were always on a scale of 1 (in my words, "I predict that my answer will suck.") to 7 (in my words, "I predict that my answer will be perfectly accurate.").  For studies 1-3, the average self-ratings from both groups ranged from 3.07 to 3.94.  So all participants were estimating their ability to give answers to post-test topics to be very average; the statistical difference shows us that the participants who had just looked up information online were on the higher side of average.
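To see why a modest gap like that can still come out "statistically significant," here is a minimal sketch of a two-sample t-statistic.  The group means, standard deviation, and sample sizes below are made-up illustrations, not numbers from Fisher et al.; the point is only that the same 0.4-point gap on a 1-7 scale crosses the conventional significance threshold (t around 2) once the groups get large:

```python
import math

# Hypothetical summary statistics on the 1-7 rating scale
# (illustrative numbers, NOT taken from Fisher et al.):
mean_no_search, mean_search = 3.4, 3.8   # a 0.4-point gap in average self-ratings
sd = 1.2                                  # assume the same spread in both groups

def t_statistic(n):
    """Two-sample t-statistic for two groups of size n with the stats above."""
    standard_error = math.sqrt(sd**2 / n + sd**2 / n)
    return (mean_search - mean_no_search) / standard_error

for n in (20, 100, 500):
    print(f"n = {n:3d} per group -> t = {t_statistic(n):.2f}")
# n =  20 per group -> t = 1.05   (not significant)
# n = 100 per group -> t = 2.36   (significant)
# n = 500 per group -> t = 5.27   (highly significant)
```

The gap between the groups never changes; only the sample size does.  That is the distinction between a statistical difference and a practical one.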

Another twist is that, along with large sample sizes that make such statistical differences easier to detect, the participants may not represent average Americans.  Ironically, as I read this article I had to do some Googling myself: the participants were all members of Amazon Mechanical Turk and I did not know what that is.  It turns out to be a crowd-sourcing service run through Amazon.com.  If you have an Amazon account you can sign up to be a Turker (not to be confused with a Twerker) and complete surveys or do other tasks online to give feedback to corporations and researchers, often about Internet content.

Further Googling led me to past research on the Turkers.  They tend to be paid about 75 cents (American) for 30 minutes of work, although average payment can range from 1 cent to $5, credited toward their Amazon accounts.  More than half of the Turkers come from the U.S. (or at least their IP addresses indicate this), and those American Turkers tend to be young adults, mostly female, most with college educations, and likely to participate "to kill time" (on average, a few hours a day).  They report that the majority of their Turking assignments have to do with performing Internet searches.

The Fisher et al. article does not contain these details nor does it clarify how much participants were compensated for each study.  Although participation was limited to Turkers with U.S. IP addresses, curiously, more men (1,004) than women (704) participated in these studies (but can sex be verified in an online environment anyway?).  The authors do acknowledge that Turkers may have a lot of experience performing Internet searches, but there is no mention that these adults are unlikely to represent "the average American" any better than college students, the most common source of participants.

So Overly Suave IT Guy may or may not deserve our scorn, but the average Googler does not have to worry.  Your Internet searches may slightly inflate your confidence but they will not turn you into an over-confident idiot.  Moreover, other research on the Hive Mind (or, in psychologese, Transactive Memory) offers a more positive spin on how Google can function as our collective back-up brain.  For example, listen to Dr. Betsy Sparrow discuss research she did in 2011 with Jenny Liu and Daniel Wegner:



Further Reading:

A version of the Fisher et al. (2015) article can be accessed on the American Psychological Association's website and the published journal article can be accessed through your local college library.

This version of the Sparrow et al. (2011) article can be accessed for personal use.

A recent podcast from "On Being" titled "Online Reflections of our Offline Lives."  Listen to Danah Boyd (of Microsoft and Data & Society) wax philosophical on the role of technology in our daily lives.  It might not be as scary as we are led to believe.
