
Sunday, August 2, 2015

The creepy-crawlies: spiders and itching

Trigger warning: spiders; itching; psychology.

The creepy-crawlies. The heeby-jeebies. Makes my skin crawl. Many of us have these reactions when we see spiders:


Logically we know that spiders are unlikely to hurt us - in fact we can kill them quite easily - and that spiders are important parts of the ecosystem. Illogically we fear them and may be gripped with an illusory sensation that they are crawling upon us...producing itching that is very real. Why is this?

Some suggest that we may be evolutionarily wired to notice spiders and perceive them as a threat. New and German (2015) asked undergraduate students to perform a perceptual task: indicate whether two crossed lines were equal or unequal in length each time they were briefly flashed on a screen. For most of the trials, all that appeared on the screen were the two lines in a cross shape, but on one trial another image was sneaked in next to the cross. This other image was either a spider or a spider-like shape; a housefly or a fly-like shape; or a hypodermic needle or a needle-like shape. This is an "inattentional blindness" test because the participants have not been instructed to look for these things: their attention has been directed only to the lines, so they should be "blind" to, or fail to notice, anything else. If participants do notice the other images, we know that those images really stood out enough to grab their attention.
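The logic of the design is easy to see in miniature. Here is a toy sketch (not the authors' actual code; trial counts and labels are hypothetical) of an inattentional-blindness block: every trial presents the attended line-judgment task, and exactly one "critical" trial slips in the unexpected image.

```python
import random

def build_trials(n_trials=200, critical_index=None, distractor="spider"):
    """Sketch of one inattentional-blindness block: every trial shows a
    cross made of two lines (equal or unequal arms) that the participant
    judges; exactly one 'critical' trial also flashes an unexpected
    image beside the cross."""
    if critical_index is None:
        critical_index = random.randrange(n_trials)
    return [
        {
            "arms_equal": random.choice([True, False]),  # the attended task
            "distractor": distractor if i == critical_index else None,
        }
        for i in range(n_trials)
    ]

block = build_trials(n_trials=200, distractor="spider")
# Exactly one trial carries the unexpected image.
assert sum(t["distractor"] is not None for t in block) == 1
```

Because the distractor appears only once and was never mentioned in the instructions, any participant who reports seeing it must have had their attention captured involuntarily, which is exactly what the spider images did.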

New and German found that the participants were much more likely to notice the spider or spider-like shapes than the other options. This was true even of participants who reported low fear of spiders and participants who reported high fear of needles. The authors believe that this represents an evolutionary wariness of spiders that has been passed down from our earliest ancestors in Africa where venomous spiders likely posed daily threats. In our modern world we are more likely to feel pain from an injection than truly be at risk from a dangerous spider bite - yet, spiders grab our attention more than needles.

If spiders grab our attention and make our skin crawl, when are we most likely to start scratching that imaginary itch? Lloyd, Hall, Hall, and McGlone (2012) showed female undergraduates images related to itching, such as insects and skin rashes, and itch-neutral images, such as flying birds. As predicted, when participants viewed the itch-related images, their self-reported levels of itchiness were significantly higher.

In addition to the self-reports, Lloyd et al. observed the participants' own scratching behavior and noted any scratching movement that lasted for more than one second. Out of all of the itch-related images, the photographs that included a person scratching him- or herself were associated with the most scratching from the participants. The authors suggest that feeling itch may be automatic but a scratching response may be triggered by a social situation that activates mirror neurons: brain cells that react the same to doing or watching a behavior.

When we put this all together, we know that your eyes are likely to be drawn to this guy if he is hanging out in your living room:


If your skin starts to crawl, you are normal: seeing insects and spiders makes us feel itchy. But you are most likely to scratch if you see your friends start scratching.

Further reading:

The New and German (2015) article is published in the journal Evolution and Human Behavior but is available online in draft form; the Lloyd et al. (2012) article can be accessed through your local college library.

Some individuals suffer from chronic, not just creepy-crawlies-induced, itching. You can read more about recent findings related to this debilitating condition in this National Institutes of Health article.

If you are thoroughly creeped out and itchy from reading this post, here is a totally unrelated video of a baby laughing to cleanse your palate. You are welcome.


Monday, July 13, 2015

Apparently you can gossip about babies and the elderly.

We have all been in a noisy situation - the school cafeteria, a party, waiting for a meeting to start - when, in the middle of a conversation with one person, we suddenly hear our name mentioned by somebody across the room. Maybe you looked like this when it happened:


The first part of this scenario describes the "cocktail party effect": it is really amazing that we can use selective listening to tune out background voices and concentrate on our conversation. This effect was first studied by Cherry (1953), who found that this sort of listening is easier when the voices appear to come from different locations, as you would experience in social situations. In the laboratory this can be mimicked by a dichotic listening task: while a participant wears headphones, a background conversation is streamed into one ear while the participant is asked to focus on a voice heard in the other ear. If, instead, all of the voices come from the same location, listening becomes much more difficult. The laboratory equivalent is streaming all of the voices into both sides of the headphones.
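The two headphone setups can be sketched as simple stereo mixes (a toy illustration, not the original apparatus; the number lists stand in for recorded voice samples):

```python
def dichotic(target, background):
    """Dichotic presentation: target voice in the left ear only,
    background voice in the right ear only (easy to separate)."""
    return [(t, b) for t, b in zip(target, background)]

def diotic(target, background):
    """Same-location presentation: both voices summed into both
    ears, so they seem to come from one place (hard to separate)."""
    mixed = [t + b for t, b in zip(target, background)]
    return [(m, m) for m in mixed]

target = [1, 2, 3]      # toy samples of the attended voice
background = [5, 7, 4]  # toy samples of the ignored voice

# Dichotic: each ear gets a different voice.
assert dichotic(target, background) == [(1, 5), (2, 7), (3, 4)]
# Same-location: each ear gets the identical mixture.
assert diotic(target, background) == [(6, 6), (9, 9), (7, 7)]
```

The point of the contrast is spatial separation: in the dichotic case the brain can use the left/right difference as a cue to select one voice, while in the mixed case that cue is gone and the voices must be untangled some other way.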

The second part of the scenario and this week's meme illustrate later work done by Moray in 1959. Usually in dichotic listening tasks the participants are pretty good at repeating what they are instructed to listen to and are barely aware of what is being streamed into the other ear. Moray found an exception to this: many people are able to notice when their names are mentioned in the background speech that they have been instructed to ignore. Later research suggests that this is particularly true of people who are easily distracted and have poor working memories.

Some modern studies show us when the response to our name may develop and when it may decline. To start, Newman (2005) performed a series of studies to determine the age at which babies start to pick out their names from background speech. Babies sat on their parents' laps and listened to a recording of three women speaking: throughout the entire recording, one voice read passages from books; at the same time, the babies heard a second voice saying their individual names alternating with a third voice saying similar names. This recording came out of a loudspeaker next to a red light that would go on when names were mentioned: so the red light served as the single "source" of the voices. Newman could tell that a baby noticed a name if the baby looked at the red light when that name was layered over the book passage.

She found that babies as young as five months showed some ability to pick out their names: they looked at the light slightly longer when their names were overlaid, but only when their names were 10 decibels louder than the words from the book passage. Newman then demonstrated that it is around age one that young children no longer require their names to be that much louder than the background speech to notice them. So this ability appears to develop in the first year of life and is then further honed toward adult ability.
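For a sense of scale, the 10-decibel head start Newman gave the youngest babies is substantial. Decibels are logarithmic, so a 10 dB level difference corresponds to ten times the sound power, or roughly 3.16 times the amplitude:

```python
import math

level_difference_db = 10

# Decibels compare powers on a log scale: dB = 10 * log10(P1 / P0),
# so a 10 dB difference means a 10x power ratio...
power_ratio = 10 ** (level_difference_db / 10)      # 10.0

# ...and amplitude scales with the square root of power,
# so about a 3.16x amplitude ratio.
amplitude_ratio = 10 ** (level_difference_db / 20)  # ~3.162

assert power_ratio == 10.0
assert abs(amplitude_ratio - math.sqrt(10)) < 1e-9
```

In other words, five-month-olds needed their names played at ten times the power of the competing story before the names popped out; by age one, that extra boost was no longer necessary.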

Switching to the other end of the lifespan, Naveh-Benjamin, Maddox, Kilb, Thomas, Fine, Chen, and Cowan (2014) performed a series of studies comparing young adults to senior citizens on a dichotic listening task. Both age groups were instructed to listen to the words streamed into one ear and to ignore the background words that were streamed into the other ear. (Although I doubt that any of them looked as cool wearing their headphones as Ruth Flowers did DJ'ing at age 72.)


Naveh-Benjamin et al. wondered whether older adults, who usually have poorer working memories due to aging, would perform like young adults who have poor working memories. Specifically: would they be more likely to notice when their names were mentioned in background speech that they were told to ignore? The results were surprising: in several variations of the study, senior citizens were consistently less likely than poor-memory young adults to notice their names in these background words. In fact, they noticed their names less often than even the high-memory young adults! This trend was not influenced by the older participants' individual working memory abilities, by which ear the background speech was streamed into, or by how quickly any of the words were paced.

Even more striking was the finding that the seniors showed very little notice of their names even when the task was changed so that they were instructed to listen to the recording that contained their names and ignore the speech streaming into the other ear! No wonder the title of this research article is, "Older adults do not notice their names..."!

Taken together, these studies suggest that our tendency to tune out or tune in is related to a number of cognitive processes. Newman suggested that infants may develop these abilities as their understanding of speech as a tool and their ability to selectively listen increase. Naveh-Benjamin et al. emphasized that they cannot determine what caused their results but they wagered that dichotic listening tasks require more brain power from older adults to concentrate on one thing and ignore another. So this extra "effort" may have produced their results. Clearly further research is required.

On a lighter note, if you are at a noisy party and gossiping about a person who is across the room - you are probably not going to get caught if that person is a baby or a senior citizen! But hopefully you will have more tact than Jerry and Elaine on "Seinfeld".


Further Reading:

The Newman (2005) and the Naveh-Benjamin et al. (2014) articles can be accessed at your local college library.

Here is a great article on the neuroscience behind the cocktail party effect by Golumbic et al. (2014).

 A "Psychology Today" blog post by Liane Davey on ending the negative gossip habit.

Sunday, February 22, 2015

Aristotle, Waterfalls, Van Gogh...oh my!

I wanted to write a post about the psychology of Perception and look what ended up in my inbox!  As long as you do not have a medical reason to skip this demonstration - follow the directions in the video by staring at the center of the swirling circle for 30 seconds until it changes to an image of Vincent Van Gogh's "Starry Night" painting: 


If the optical illusion worked correctly, the image of "Starry Night" should have appeared as if it, too, were whirling and swirling even though it was not moving at all (for you doubters who think the painting was animated on YouTube, check out this version)!

This illusion happens thanks to sensory adaptation - what happens any time you get used to seeing something, hearing something, smelling something, etc.  The type of adaptation you experienced is called Rapid Motion Adaptation or RMA: your brain got used to perceiving motion and you could see evidence of that when the scene switched to something that was still.  The illusion is sometimes described as the Motion After Effect.

This after-effect was mentioned as early as 350 B.C. by the philosopher, Aristotle.  In "Parva Naturalia" he wrote, "...when persons turn away from looking at objects in motion, e.g. rivers, and especially those which flow very rapidly, they find that the visual stimulations still present themselves, for the things really at rest are then seen moving."

An interesting twist of the RMA is that the after-effect motion seems to move in the direction opposite to the actual movement that was first seen.  A record of this "Waterfall Effect" is found in Robert Addams' 1834 description of how the rocks next to a waterfall began to appear to move upwards after he had stared at the water plunging downwards.

In 2011, Glasser, Tsui, Pack, and Tadin did a series of studies on the RMA and the after-effect illusion.  Although their sample sizes were small, their findings suggest that the RMA can be triggered even faster than had previously been suspected: it can be provoked by less than one second of watching movement. Even when participants were not aware of the direction of the movement they had briefly seen, they were usually correct in reporting an after-effect motion in the opposite direction.

Using two macaques (monkeys), the researchers were also able to pinpoint neurons in the middle temporal (MT) area of the brain that were activated when the RMA was triggered.  This region is involved in processing visual motion.  Even after a very brief view of an image in motion, the neuronal activity indicated that the monkeys' brains were responding to the follow-up still images as if they, too, were moving.  The suggestion is that similar activity occurs in our brains as well.

So even though we might call "The Starry Night" demonstration an "Optical Illusion," it is not so much our eyes playing tricks on us - instead it is evidence that our brains are adapting to the world around us.

Further Reading:

A really wonderful website on the history of the Motion After Effect.  It includes larger sections from Aristotle and Addams, plus other early works that describe this effect.

Here is the Glasser et al. (2011) article if you would like to read in detail about their studies.  I especially enjoyed the part about plaid stimuli - had never thought of plaid in relation to perception!


photos-public-domain.com