Sunday, October 25, 2015

Revenge of the preteen nerds

Many people get an education because they want good careers that earn good money. Being studious may not make you cool but it should set you up to meet that goal; students who party instead of studying may not be so successful:


This quotation is actually from Bill Gates, even if Actual Advice Mallard seems to be stating it here. Did Bill Gates get it right? Are today's nerdy kids likely to be well-employed and financially stable as adults?

Spengler, Brunner, Damian, Lüdtke, Martin, and Roberts (2015) examined data from a large-scale, longitudinal study done in Luxembourg, a small country in Europe. In 1968 the MAGRIP study collected data on Luxembourgish youngsters around age 12. This included measures of the children's IQs, their parents' levels of wealth and education (SES), ratings from teachers on how studious the children appeared to be, and self-reports about feelings of inferiority, being a responsible student, defying their parents' rules, and talking back to their parents.

In 2008, the second wave of MAGRIP followed up with 745 of these same individuals when they were in their 40s. At that time the participants were asked about their educational attainment (years of education after elementary school), their current or most recent jobs, and their yearly incomes. In their analysis, Spengler et al. wanted to know what traits at age 12 were associated with greater educational attainment, occupational success (as measured by prestige and social class), and higher incomes in middle age.

In line with past research, the participants who had higher IQs and higher-SES families when they were 12 attained more education by middle age. However, even when those two variables were accounted for, three other attributes predicted educational attainment: feeling that you are not as good as others, studiousness, and talking back to parents and not always following the rules. Two of these are not surprising: self-reported feelings of inferiority at age 12 were related to lower levels of education by one's 40s, while teachers' higher ratings of studiousness at age 12 predicted higher levels of education by middle age. Interestingly, reporting at age 12 that you talk back to and do not always obey your parents was also related to higher educational attainment in adulthood.

Likewise, more occupational success in middle age was predicted by having a higher IQ and coming from a higher-SES family at age 12. Even when those two variables were held steady, seeing yourself as a responsible student at age 12 predicted occupational success in your 40s. Higher teacher ratings of studiousness during childhood also predicted occupational success, but this relationship played out through the amount of education that the participants pursued. Spengler et al. clarified that being seen as studious by your teachers in childhood suggests that you have traits that might lead you to choose more years of education at more challenging levels. In fact, educational attainment was the best predictor of occupational success.

The analysis revealed an unexpected finding related to personal income. The participants who at age 12 admitted that they talked back to their parents and did not always follow their rules were more likely to have higher incomes during their 40s! This was true even when childhood IQ, family SES, and lifetime educational attainment were taken into account. Spengler et al. caution us that this finding needs to be replicated in future research - so this is not a green light for preteens around the globe to sass back at their parents. They also offer two possible explanations: these individuals may be more willing to argue for higher wages, or they may have broken rules as adults to earn that higher income.

I would add that it is also possible that they were raised by good, Authoritative parents. These parents express love to their children and raise them with the right amount of fair discipline; they are also willing to discuss household rules with their children, conversations which may start off with "talking back." Authoritative parents want their kids to be able to reason about their behavior choices instead of simply showing obedience to authority. So it would stand to reason that their children may not always follow their parents' rules to the letter. Authoritative parenting also prepares children for white-collar jobs that require independent decision making and less emphasis on following orders; these jobs may also offer higher levels of pay.

Another possibility that I can imagine is that some of these individuals are Gifted. In general, higher IQs are related to higher incomes, and Gifted children have IQs at least two standard deviations above the average child's. Parents of Gifted children often joke about them being "little lawyers" because they can be argumentative even with adults in authority. So Spengler et al.'s surprising finding may simply reflect a variable that is related to unusually high intelligence.

The authors admit that the predictive powers of pre-teen Luxembourgish children's behavior in 1968 may not generalize: it may be that educational attainment, occupational success, and income level are predicted by different traits today, and these traits may vary from culture to culture. On the other hand, the Spengler et al. analysis demonstrated that personal factors like being a responsible student, coming across to teachers as studious, and how you relate to people in positions of authority can predict achievement beyond their relationships with general intelligence and social class. In that case, being a studious (and perhaps argumentative) nerd may be enough: you don't have to be a genius like Bill Gates or have a wealthy parent like Bill Gates to become an adult with a good job and a good income.

Further Reading:

The Spengler et al. (2015) article can be accessed through your local college library.

Clinical Psychologist Kelly Flanagan blogged for The Huffington Post about why he thinks "...Every Kid Should Talk Back to Their Parents."

Serious students may have a good chance of landing a job found in U.S. News and World Report's list of "The 100 Best Jobs" of 2015.

BONUS: Watch an amusing promotional video titled "Is it true what they say about... Luxembourg?" made by the Luxembourg National Tourism Board. Spoiler Alert: no volcanoes; yes happy dogs.




Sunday, October 18, 2015

Phoning it in on exams

Even in a crowded lecture hall your professors know when you are using your smartphone during class.


Some instructors ignore this, some punish this, but most of them do not like it.


Although students might think that class rules limiting or banning the use of mobile phones are cruel, in truth they come from a good place: most faculty believe that your phones will hurt your grades by stealing your attention. A basic idea from cognitive psychology is that you have to pay attention to something if you want to commit it to memory or be able to use the idea effectively.


A study done by Bjornsen and Archer (2015) further supports this argument. These professors spent two semesters in their psychology classes surveying a total of 218 college students about their in-class cell phone use. At the end of each class the students answered a questionnaire about the number of times that they looked at their phones for: social networking (texting, email, social network apps like Facebook and Instagram); getting information (Googling, checking the online syllabus, checking a website related to class); organizing their lives (personal calendar); or playing a game. The students were assured that their answers would not influence how the professor graded their work, so the assumption is that these self-reports were honest.

The authors compared the amount of each type of mobile phone use to test scores in these classes. They found that both social media use and gaming during class were associated with lower test scores, with playing games being much worse than using social media. Because social media use was more common than playing games, Bjornsen and Archer looked at these data in more detail. They divided the students who used social media during class into high (5x per class), medium (2.4x per class), and low (1x per class) groups, then looked at average test scores for each group. Across five exams, the high in-class social media users scored an average of 74% and the low users scored an average of 76%. These scores are close, but they could be the difference between getting a C or a C+ in a class.
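The grouping step described above can be sketched in a few lines of Python. The student records below are invented for illustration - they are not the Bjornsen and Archer data, and the cutoffs are my own guesses at how the bins might be defined:

```python
# Toy sketch: bin students by how often they reported checking social
# media during class, then average each bin's exam scores.
# All numbers below are hypothetical, NOT from Bjornsen & Archer (2015).
from statistics import mean

# (social media checks per class, exam score) for hypothetical students
records = [(5, 72), (6, 75), (2, 77), (3, 74), (1, 78), (0, 75)]

def bin_label(checks):
    """Assign a usage bin; the cutoffs here are assumed, not the authors'."""
    if checks >= 5:
        return "high"
    if checks >= 2:
        return "medium"
    return "low"

groups = {}
for checks, score in records:
    groups.setdefault(bin_label(checks), []).append(score)

for label in ("high", "medium", "low"):
    print(label, round(mean(groups[label]), 1))
```

With these made-up records, the high-use group averages a few points below the low-use group - the same small-but-real gap the study reports.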

However, one of these effects changed when Bjornsen and Archer controlled for overall GPAs (grade point averages). As you might guess, the students with the worst GPAs were more likely to play games in class, so the relationship between game playing and low test scores was likely influenced by this third variable: it disappeared once GPA was factored in. On the other hand, even when overall GPA was included as a factor, the relationship between in-class social media use and lower exam scores remained significant.
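One simple way to see how "controlling for a third variable" can make a relationship vanish is a first-order partial correlation. The correlation values below are hypothetical, chosen only to mimic the pattern described above; this is not the authors' actual analysis:

```python
# Sketch of statistically controlling for a third variable using a
# first-order partial correlation. All r values are invented to
# illustrate the gaming/GPA pattern, NOT taken from the study.
from math import sqrt

def partial_r(r_xy, r_xz, r_yz):
    """Correlation between x and y with z held constant."""
    return (r_xy - r_xz * r_yz) / sqrt((1 - r_xz**2) * (1 - r_yz**2))

# x = in-class gaming, y = test scores, z = overall GPA (hypothetical)
r_gaming_scores = -0.30   # raw link: more gaming, lower scores
r_gaming_gpa = -0.50      # students with lower GPAs game more
r_scores_gpa = 0.60       # GPA strongly tracks test scores

print(round(partial_r(r_gaming_scores, r_gaming_gpa, r_scores_gpa), 2))  # → 0.0
```

With these numbers the raw gaming-scores correlation is entirely explained by GPA: once GPA is held constant, the partial correlation drops to zero, just as the gaming effect did in the study.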

This implies that even students who have higher GPAs still score lower on exams when they use social media during those classes! From past research on college students we know that more than 90% admit to using cellphones during class and that 77% do not believe that this would cause problems with their learning. Bjornsen and Archer's results suggest that students, especially students who do well in school, may not be aware that their test scores might be a bit higher if they reduced or stopped using social media during class time.

Before you worry that this study will drive professors to ban cell phones, the authors suggest that this is not sensible given that today's university students are used to having their smartphones as constant companions. They cite past research about this population: college students spend 5-9 hours a day on their phones, which includes 95 minutes of texting, 95 minutes on social networking apps, 49 minutes emailing, and 34 minutes surfing the Web. Based on this, Bjornsen and Archer suggest some integration of phones into the classroom. For example, I will ask students to put away their phones if they are not participating or are distracting others in the class, but I don't mind if students take pictures of notes and images that I project onto the screen at the front of the room (even though there is good research behind the idea that taking notes by hand increases your memory and understanding of those notes).

 
Ultimately, university students are adults who should weigh the benefits and costs of using their phones in class. For some, being connected to friends and family will be more important than scoring a few points higher on exams. For others, every point matters because their grades are crucial for getting into programs like Nursing, keeping scholarships, or having good GPAs for graduate school applications. If you aren't sure which group you fall into...ask Siri?

Further Reading:

The Bjornsen and Archer (2015) article can be accessed through your local college library.

There's an app for that! In 2014 college students Mitch Gardner and Rob Richardson created Pocket Points, an app that rewards students for not checking their phones during class. Find out if your university uses this system - if so you can get discounts at stores and restaurants. As if scoring 2% higher on average exam scores was not enough!

Cell or mobile phone addiction is addressed by Web MD. Find out the symptoms and suggestions for managing your smartphone time.

Sunday, October 11, 2015

When I'm 64

As a serious college student I remember feeling old before my time compared to some of my classmates who would blow off studying to socialize. Now as a middle-aged professor I don't "feel" as old as many of my peers (even though my daughter reminds me all of the time that I am "very old"). Apparently my experience is shared by Yoko Ono:


Chronological age, or how many years old you are, is just one way to think about aging. You can also consider: Biological age - how healthy you are; Social age - the habits you have and the roles that you take on; and Psychological age - how well you reason and think.

A recent study highlights how people's perceptions of their Biological and Social ages can influence their Subjective ages - or how old they feel. Stephan, Demulier, and Terracciano (2012) asked more than 1,000 French adults ages 18-91 to rate their physical health, to complete a version of the Big Five personality inventory, and to report their subjective ages.

The results support the idea that how old we feel is based on more than how old we are. These effects varied depending on the ages of the participants. The authors did not state the ranges of the age groups, but we would usually assume that their young adults were ages 18-39, middle-aged adults were 40-59, and older adults were 60-91.

A strong relationship between chronological age, health, and subjective age emerged. Middle aged and older adult participants who rated themselves as being in good health were more likely to say that they felt younger than they actually were. Stephan et al. clarify that on average, these middle aged adults felt two years younger and these older adults felt four years younger. These results support the importance of considering a person's Biological age.

When the authors controlled for health and demographic factors, a relationship between chronological age, certain Big Five traits, and subjective age appeared. Because our personality traits often relate to our behaviors and activities, these results support the importance of assessing a person's Social age. For example, young adults who were high in Conscientiousness felt older than they really were. Conscientiousness implies being responsible and organized. Motivated and reliable young adults might feel like they are older than their peers because these are not characteristics that Western culture often associates with that age.

Middle aged and older adults who were high in Openness to Experience, and older adults who were high in Extraversion felt younger than they were. Openness to Experience has to do with engaging in diverse interests and being open to new ideas; Extraversion is associated with being out-going and dynamic. Both are traits that Western culture does not associate with middle and older age, so people with these traits are likely to feel younger than they really are as they age.

These results explain the experience that Yoko Ono and I share. When I was a young adult, my conscientious behavior did not match my stereotype about my age: so I felt older. Now as a middle aged person who has broad interests and loves creativity, my self-perception again runs contrary to the stereotype about my age: so I feel younger. And in a few decades, I can predict that I will continue to feel younger because I am relatively high in Extraversion.

This research also raises a question about Western stereotypes about age. What does it mean that we view young adults as irresponsible, middle-aged and older adults as stuck in their ways, and senior citizens as unsociable? If we can imagine a time when these negative assumptions are no longer part of our culture, it would have implications for subjective age. Instead of feeling older or younger than our chronological age, we would simply be that old and recognize that at all ages individuals can differ on Big Five traits.

Further Reading:

A pre-publication version of the Stephan et al. (2012) article can be read here thanks to the National Institute of Health. The Psychology and Aging article can be accessed through your local college library.

A blog post from the AARP about research done by Rippon and Steptoe (2015): "Feeling Old vs Being Old." More support for Biological age!

Know an Extraverted senior? The social networking site Meet Up has groups around the world for outgoing older people who want to socialize!

Sunday, October 4, 2015

In defense of Kristen Stewart

There are many memes mocking Kristen Stewart for not smiling. This week's meme is one of them:

 
However, if you type her name into a Google Images search, and then do a second search for her Twilight co-star Robert Pattinson you will see that the two actors are both pictured smiling and not smiling. When I did this (albeit unscientific) search and compared the first 25 images for both actors, I found that both of them were shown smiling 11 times and not smiling 14 times. So why the shade for Kristen while Robert's image remains sparkling?

Part of it is due to gender expectations in our culture: women smile more than men and are punished more for not smiling. The most recent review of this phenomenon occurred in 2003, when LaFrance, Hecht, and Paluck published a meta-analysis of 162 studies. A meta-analysis allows researchers to statistically combine the results of many studies to determine if a difference exists and how big (or meaningful) that difference may be. The overall results from LaFrance et al. confirm that across these studies there is a small to moderate - that is, noticeable in the real world - effect of men smiling less. Fans are used to seeing women smiling, so they notice and react poorly when Kristen Stewart bucks that gender expectation. Likewise, fans are used to men not smiling, so the same facial expressions from Robert Pattinson simply go unnoticed.
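To make the "statistically combine" step concrete, here is a minimal fixed-effect pooling sketch in Python. The effect sizes and sample sizes are invented for demonstration, and this simplified procedure is an assumption on my part - it is not the analysis LaFrance et al. actually ran:

```python
# Toy fixed-effect meta-analysis: pool standardized mean differences
# (Cohen's d) from several hypothetical studies. The d values and
# sample sizes are invented, NOT data from LaFrance et al. (2003).

def pooled_effect(studies):
    """Combine (d, n1, n2) tuples into one weighted effect size.

    Each study is weighted by the inverse of its sampling variance,
    so larger (more precise) studies count more toward the estimate.
    """
    weights, weighted_ds = [], []
    for d, n1, n2 in studies:
        # Common approximation of the sampling variance of Cohen's d
        var = (n1 + n2) / (n1 * n2) + d**2 / (2 * (n1 + n2))
        w = 1 / var
        weights.append(w)
        weighted_ds.append(w * d)
    return sum(weighted_ds) / sum(weights)

# Hypothetical studies: (observed d, group size 1, group size 2)
studies = [(0.45, 30, 30), (0.20, 100, 100), (0.35, 60, 60)]
print(round(pooled_effect(studies), 3))
```

Note how the pooled estimate lands closest to the big 200-person study: weighting by precision is what lets a meta-analysis say something more reliable than any single study.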

The researchers also used statistics to examine the different contingencies of the studies to see what is associated with this difference getting smaller or larger. In some cases it would come out smaller than the overall difference: in these situations women and men were closer to smiling at similar rates. Many of these effects were very small or even close to zero, which means that in the real world these contexts would likely be associated with very few observable differences between women and men:
*when people are not aware of being observed
*when they are in a group of four or more people (so the focus is not on one person)
*when they are not interacting with the people around them
*when they are very familiar with each other
*if they are comfortable because there is low pressure to impress
*when they are talking to a younger person or an older person
*when they are paired with a woman
*when they are interacting with somebody of the opposite sex
*when they share equal power with the other person
*if they are asked to play a role that requires caretaking, like taking care of a baby
*if they are forced to argue against the other person
*if people are from England
*if people are African-American
*if people are middle aged or senior citizens

In other cases this gender difference would come out larger than the overall difference: in these situations women were even more likely to smile than men. These range from moderate to almost high effects, which means that in the real world these contexts would likely be associated with actual observable differences between women and men:
*when people are alone (and presumably self-conscious about being observed)
*when people are alone but asked to imagine another person being with them
*when they are paired with a man
*if they are asked to persuade somebody
*if they have to reveal personal information about themselves
*if they are made to feel embarrassed
*if people are Canadian
*if people are teenagers (a time of gender intensification)

Looking at the results, LaFrance et al. note that "...the extent of sex differences in smiling is highly contingent on social groups and social factors" (p. 326). In simpler language, men tend to smile less than women, but when this happens and how obvious it is depends on the characteristics of the situation. There are personal and demographic factors like age, race, and culture. There is also the question of what the situation requires: do they have to persuade, argue, or care for another being? And there is the question of who they are interacting with - do they share the same age, sex, or level of power in the situation?

Notably for Kristen Stewart, the results also demonstrate that people are more likely to show this gender difference when they know that they are being watched, when they imagine that they are being watched, and when they feel like they need to make a good impression (or instead are facing embarrassment). So another reason that fans may be critical is that, by being an actor and a public figure, she is constantly in these contexts, yet she does not do what most women would do in those situations: she does not smile. On the other hand, when Robert Pattinson reacts the same way on the red carpet, he is doing exactly what we expect men to do, so once again he escapes criticism. And that really bites.

Further Reading:

A pdf of the LaFrance et al. (2003) article can be accessed on Dr. Elizabeth (Betsy) Paluck's website.

Kristen Stewart may wish to work on her smile - not for the fans - but for how smiling, even fake smiling, might help her deal with stress. Read the Association for Psychological Science (APS) coverage of research done by Kraft and Pressman (2012).

Kristen Stewart is not alone. Read Emily Matchar's article, "Memoirs of an Un-Smiling Woman," from The Atlantic.