Saturday, September 5, 2015

If we named our hurricanes differently, would it save lives?

Picture this scenario: it’s mid-October and a weather alert pops up on your screen, advising you that a hurricane is headed your way. It’s still four days out, and while forecasters aren’t altogether certain how powerful it will be, or where it may strike land, you’re thinking – what should I do to prepare, if anything?

Now suppose you heard that the hurricane’s name was Jennifer.  But later you learned that it was named Jack.  Would it make a difference?

Apparently it would, according to researchers from the University of Illinois at Urbana-Champaign, who maintain that people judge hurricane risk, in part, based on the storm’s name. In their study “Why Have Female Hurricanes Killed More People than Male Ones?” they explain that the more feminine the name, the less likely people are to take preparatory action (note: their study was published in the Proceedings of the National Academy of Sciences).

Said the authors:  

“Meteorologists and geoscientists have called for greater consideration of social science factors that predict responses to natural hazards. We answer this call by highlighting the influence of an unexplored social factor, gender-based expectations, on the human toll of hurricanes that are assigned gendered names. Feminine-named hurricanes (vs. masculine-named hurricanes) cause significantly more deaths, apparently because they lead to lower perceived risk and consequently less preparedness. Using names such as Eloise or Charlie for referencing hurricanes has been thought by meteorologists to enhance the clarity and recall of storm information. We show that this practice also taps into well-developed and widely held gender stereotypes, with potentially deadly consequences.”

The authors’ conclusions have been challenged on several counts, but their message is worth serious consideration: would we save lives if we named hurricanes based on their severity? In other words, when we decide whether to take action ahead of a coming storm, to what degree are we influenced by the relative femininity or masculinity of a hurricane’s name?

The authors’ analysis included 94 hurricanes that struck the U.S. between 1950 and 2012, recognizing that up until 1979, hurricanes were given only female names (for the 1950–1978 storms, the researchers still rated the relative femininity of each name).

The study was strongly criticized by social scientist Jeff Lazo from the National Center for Atmospheric Research. According to an article by Ed Yong on National Geographic’s website, “[Lazo] thinks the pattern is most likely a statistical fluke which arose because of the ways in which the team analyzed their data.”
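
To make the dispute concrete, here is a minimal sketch – in Python, with invented numbers – of the general kind of archival analysis at issue: regressing hurricane death counts on a name-femininity rating using a negative binomial model, a standard choice for over-dispersed count data. Everything below is a hypothetical stand-in, not the authors’ actual data, model, or code.

import pandas as pd
import statsmodels.api as sm
import statsmodels.formula.api as smf

# Hypothetical stand-in data. The real analysis covered 94 U.S. hurricanes
# (1950-2012); these ten rows exist only to make the sketch runnable.
storms = pd.DataFrame({
    "deaths":     [5, 62, 3, 21, 1, 15, 0, 156, 9, 41],                    # death tolls
    "femininity": [8.9, 2.2, 9.1, 1.5, 6.8, 9.4, 3.0, 7.6, 2.8, 8.1],      # higher = more feminine name
    "damage":     [1.2, 30.0, 0.8, 24.1, 0.3, 5.5, 0.2, 48.0, 3.1, 10.4],  # normalized damage, $ billions
})

# Model death counts as a function of name femininity, storm damage, and
# their interaction (the published claim was that femininity mattered most
# for the more damaging storms).
model = smf.glm(
    "deaths ~ femininity * damage",
    data=storms,
    family=sm.families.NegativeBinomial(),
).fit()
print(model.summary())

With counts this skewed, a handful of catastrophic storms can dominate the fit, and judgment calls – which storms to include (every pre-1979 storm had a female name), how to rate the names, whether to count deaths raw or relative to exposure – can swing the femininity coefficient. That fragility is exactly what Lazo is pointing to.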

As the debate takes flight, few would disagree that “men are linked to strength and aggression, and women with warmth and passivity,” as Yong’s article puts it. The question is: do these unconscious biases have real-life consequences in how we prepare for impending storms?


Said study author Sharon Shavitt, as quoted in Yong’s article: “It may make sense to move away from human names, but other labels could also create problems if they are associated with perceptions of mildness or gentleness. . . . The key is to provide information and labels that are relevant to the storm’s severity.”

##

Sunday, August 16, 2015

Did you ever notice that . . .

Did you ever notice that . . . you can tell if a person is left-handed or right-handed by the way they clap?  Try it out at dinner tonight.  Ask your dinner mates to start clapping and notice which hand is active and which hand is passive (the passive hand “receives” the clap).  Now encourage them to clap the other way, that is, making the non-dominant hand the active one.  Feels strange, no?

Some folks, of course, are “neutral clappers” (both hands meeting in the middle), but they’re a rare breed. So the next time you’re at a concert, or a play, take a peek at how people clap – you’ll immediately be able to tell whether they’re a lefty or a righty (my wife Roe, by the way, is ambilevous: “having the ability to perform manual skill tasks with both hands”).

Did you ever notice that . . . when families are out walking, the father walks ahead of the pack, while the mother trails the field?  Why is that?

I suppose it’s linked to DNA. After all, back in the day, it made sense to have the hunter-gatherer out in front – protecting the family, finding the next meal. But today, what’s the thinking?  I remember a field trip some years ago that my wife organized – an overnighter with 150 fifth graders accompanied by 25 chaperones (three of whom were male). No big surprise that one of the men (walking out ahead) lost track of one of his kids. 

So I wonder: do we still need to be out in front?

Did you ever notice that . . . in conversation, some people perpetually use generic descriptors instead of first names? (e.g., “My wife and I are heading to Florida” instead of “Roe and I are heading to Florida”).

Granted, if we’re meeting each other for the first time, and I don’t know your sister’s name, a generic descriptor makes perfect sense. But over time, I’d think that people would make the switch. Yet many don’t.  Why is that? Why do people – long after they know us, and they know that we know all the players – continue to use descriptors? Perhaps it says something about our personality, or, more simply, it’s just a verbal habit that mimics our parents’ conversational style. Either way, I must admit, it’s a mystery to me (as it happens, early in a relationship, I’ll make a point of introducing family members with both descriptor and first name – “my sister Ilene” – and then, as the conversation evolves, I’ll switch to the name alone. But I may be the odd one here).

##


Sunday, August 9, 2015

Brain-Training Exercises: Do They Work?

“Before investing time and money on brain games, consider what economists call opportunity costs: If an hour spent doing solo software drills is an hour not spent hiking, learning Italian, making a new recipe, or playing with your grandchildren, it may not be worth it. But if it replaces time spent in a sedentary state, like watching television, the choice may make more sense for you.” – Stanford Center on Longevity

They’re fun. They’re challenging. And they’re somewhat addicting.  But do they work? Do brain-training exercises – that is, sitting in front of a computer performing specific repetitive tasks – really improve cognitive function?

An active debate has emerged within the scientific community, with neuroscientists and behavioral psychologists analyzing first-generation data to assess whether brain-training exercises make a substantive difference in cognitive vigor.

This past fall the Stanford Center on Longevity, working with the Max Planck Institute for Human Development in Berlin, issued a word of caution, urging consumers to be wary of exaggerated claims that brain-training exercises will significantly enhance brain function and brain fitness.

And while the Center’s report, in this writer’s view, was quite balanced in assessing the new field, a group of 127 scientists, from 18 countries, took issue with several of the Center’s findings and crafted an Open Letter to share their concerns.  

What’s clear in all of this is that the “brain fitness” movement, in due time, may well resemble the thriving physical fitness movement.  Said Alvaro Fernandez, in a piece for the Huffington Post: “It took decades of conflicting research and confusing media coverage to finally spread the idea that daily life activities are far from sufficient to keep us physically fit . . . From those humble beginnings, health club memberships in 2014 amounted to $78+ billion dollars in annual revenues.” This same notion, many believe, will apply to brain fitness in the coming years. 

So what exactly did the Stanford Center report have to say?  And how did the 127 scientists, in their Open Letter, respond? Some highlights:

Stanford Center on Longevity

·        “It would be appropriate to conclude . . . that the potential to learn new skills remains intact throughout the life span. However at this point it is not appropriate to conclude that training-induced changes go significantly beyond the learned skills, that they affect broad abilities with real-world relevance, or that they generally promote ‘brain health’.”

·        “These conclusions do not mean that the brain does not remain malleable, even in old age. Any mentally effortful new experience, such as learning a language, acquiring a motor skill, navigating in a new environment, and, yes, playing commercially available computer games, will produce changes in those neural systems that support acquisition of the new skill.”

·        “Some of the initial [research] results are promising and make further research highly desirable. However, at present, these findings do not provide a sound basis for the claims made by commercial companies selling brain games.”

·        “We also need to keep in mind opportunity costs. Time spent playing the games is time not spent reading, socializing, gardening, exercising, or engaging in many other activities that may benefit cognitive and physical health of older adults. Given that the effects of playing the games tend to be task-specific, it may be advisable to train an activity that by itself comes with benefits for everyday life. Another drawback of publicizing computer games as a fix to deteriorating cognitive performance is that it diverts attention and resources from prevention efforts. The promise of a magic bullet detracts from the message that cognitive vigor in old age, to the extent that it can be influenced by the lives we live, reflects the long-term effects of a healthy and active lifestyle.”

The 127 scientists respond:

·        “. . . [A] substantial and growing body of evidence shows that certain cognitive training regimens can significantly improve cognitive function, including in ways that generalize to everyday life.”

·        “Over three decades, researchers have built a huge body of evidence that brain plasticity is a lifelong phenomenon – as you acknowledge. However, the [Stanford Center] statement fails to acknowledge that this evidence was derived from training experiments directly documenting the improvement of sensory, cognitive, motor, and functional performance.”

Leading the Open Letter movement was Dr. Michael Merzenich, a member of both the National Academy of Sciences and the Institute of Medicine. Said Merzenich: “The authors of the Longevity Center statement properly concluded that a large body of work has shown there is plasticity throughout the brain and throughout life. . . . It was rather astounding, then, that this same group failed to notice that we proved that through hundreds of studies showing we can drive positive change in the brain through directed, intensive, computer-guided training. It’s silly that anyone would think that we can make cognitive training that works in labs, but not in people’s homes.”


##

Sunday, August 2, 2015

Geniuses in the office: more pain than pleasure?

Have you ever worked for a genius?  I have. 

The year was 1979, and though the subsequent six years were emotionally painful (he had this unique ability to make you feel inept at every turn), there were positive results, economically speaking.

Those six years remain fresh in my mind, leading me to wonder, from time to time: if you work for a genius, should you bolt or should you stick?

That was the dominant thought that ran through my mind when I read Walter Isaacson’s grueling biography of Steve Jobs. Jobs, of course, was a genius by everyone’s account, changing the course of five (yes, five) industries. Yet nearly every page of Isaacson’s text revealed Jobs’ cruel and demonic treatment of colleagues. So I wondered: why did these people stick?

Last week, I found the answer. 

It appeared on page 332 of Doris Kearns Goodwin’s glorious book “The Bully Pulpit” (which traces the intertwining lives of Presidents Teddy Roosevelt and William Taft).  The dynamic passage – authored by Ida Tarbell, the nation’s leading journalist a century ago – was written about Sam McClure, the nation’s leading publisher. But it could just as easily have been written about Jobs, a century later.  Wrote Tarbell, of McClure (in a letter to colleagues):

“Never forget that it was he and nobody else who has created that place. . . He is a very extraordinary creature, you can’t put him into a machine and make him run smoothly with the other wheels and things. . . . . Able methodical people grow on every bush but genius comes once in a generation and if you ever get in its vicinity thank the Lord & stick.  You probably will be laid up now and then in a sanitarium recovering from the effort to follow him but that’s a small matter if you really get into touch finally with that wonderful brain. . . . If there was nothing in all this but the annoyance and uncertainty & confusion – that is, if there were no results – then we might rebel, but there are always results – vital ones. . . . The great schemes, the daring moves in that business have always been [his]. They will continue to be. His one hundredth idea is a stroke of genius. Be on hand to grasp that one hundredth idea.”

Hire or Fire?
So when it comes to genius, do we bolt, or do we stick?  Do we hire or do we fire?

The verdict is unclear.  Enthusiasts maintain that businesses need to actively recruit geniuses in order to advance the organization. Others, however, insist that geniuses do more harm than good and should be put out to pasture.  Below are two contrasting views.  Take your pick.

Hire ‘Em – Dave Logan, writing for CBS Money Watch
Logan urges companies to hire geniuses, then learn to manage them. Logan acknowledges that, as bright as geniuses are, they can be incredibly difficult to work with (he jokes: “. . . the chance that [the genius] will offend someone in a conservative culture is 100% – in the first week”). Nonetheless, Logan recommends that you pull the trigger, saying: “If the hiring manager knows the tradeoffs, they’ll often do the right thing for everyone by hiring the genius, and then working to minimize the deficits, or clean up messes when they happen.”

Fire ‘Em – Scott Lowe, independent consultant, in an article for www.techrepublic.com
Says Lowe: “Eventually, when a serious attitude problem exists, it’s more than likely that you’ll need to fire the person for the sake of the rest of the team. . . For my own organization, I hire attitude first, skill second. . . . Look for people who fit our culture and have appropriate skills to do the job. . . . You can teach skills, but teaching attitude is much harder.”

Stick or bolt?  Hire or fire?  It just might take a genius to decide what to do.


##

Sunday, July 26, 2015

How do you describe yourself?

“We’re like onions, and we have layers.” – Shrek

Are you cautious or carefree? Decisive or indecisive? An introvert or an extrovert?

Humans love to label, and labels fall easily from our lips. We label everything in our path – from habits and character traits to style, tempo and drive.

But why? What compels us to do this? Are we not, as human beings, dynamic creatures who are constantly evolving? If so, why stick a label on it?  Leading to this nagging question: do self-imposed labels serve us well?  Or do they limit our ability to change and grow?

In today’s high-octane culture, enveloped by intense media scrutiny (of every one, and every thing!), it’s difficult not to classify – not just others (he’s lazy, she’s a math whiz, he’s so forgetful), but ourselves (I’m not a risk taker, I’m too shy).  But do these labels serve us well?

I vote no. I maintain that labels, by and large, inhibit our growth and impede our ability to make the most of our lives. They limit us. They restrict us. They make mid-course corrections more difficult.

Part of the problem, of course, is the binary nature of self-evaluation. It’s common to describe people – and ourselves – as introvert/extrovert, optimist/pessimist, Type A/Type B, Republican/Democrat, high maintenance/low maintenance.  Why, pray tell, are there only two categories?

A search of the literature finds some breaks in the binary stranglehold (ambiverts are now a recognized category, and I recently discovered what it means to be a Type C and a Type D personality). Even Myers-Briggs, which graciously offers us 16 categories, hinges on a binary framework – four either/or dimensions (introversion/extroversion, sensing/intuition, thinking/feeling, judging/perceiving) whose 2⁴ combinations produce the 16 types.
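
For the combinatorially inclined, here’s a toy sketch in Python that makes the arithmetic plain (nothing here comes from Myers-Briggs itself beyond the four letter pairs):

# Four either/or dimensions yield 2**4 = 16 personality types.
from itertools import product

dimensions = [("I", "E"), ("S", "N"), ("T", "F"), ("J", "P")]
types = ["".join(combo) for combo in product(*dimensions)]
print(len(types))   # 16
print(types[:4])    # ['ISTJ', 'ISTP', 'ISFJ', 'ISFP']

Sixteen boxes, every one of them built from the same four binary switches.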

Aren’t humans a little more complex than that?

Which brings us to this brilliant piece written by a young woman who was asked, simply: “How do you classify yourself?” (her answer appeared on the website www.worldinconversation.org):

“I don’t like labels. In my opinion, people are people. Everyone has differences and everyone has similarities. As Shrek says, we’re like onions and we have layers. Why would anyone want to define himself or herself by just one of those layers? Labels have started wars, torn families apart, and caused heartache. I like to think that I’m made up of many things, and that just one thing doesn’t define me. However, for the sake of the prompt, these are what define me:

“I am a single white female. I am German and Hungarian. I am a sister, daughter, niece, granddaughter, great niece, and cousin. In the summer, I’m a child-care provider. The rest of the year, I’m a poor college kid. I’m a musician: I play trumpet, piano, and ukulele. I am a ballet, tap, and swing dancer. I’m a singer, an alto to be specific. I am a connoisseur of Lindor truffles, peach pie, Doritos, and raspberry smoothies. I am a lover of oldies’ music, high-heels, and vintage clothes. I’m an avid celebrity-rag reader. I’m a Jack Benny Program listener. I’m a collector of Smokey Bear paraphernalia, post-cards, Broadway show pins, and Snapple caps. I am a member of the Pennsylvania State Marching Blue band and the a capella group, Blue in the FACE. I am a world explorer.

“I am imperfect. I’m a crybaby. I am a complainer. I am a devil’s advocate. I’m a prep. I am indecisive. I’m a ‘goody-two-shoes.’ I’m a band geek. I’m a Harry Potter junkie. I am enthusiastic and loud when it’s not socially acceptable. I’m a lover of horribly written Meg Cabot novels like The Princess Diaries series. I’m a nervous giggler in inappropriate situations. I’m a too-cautious driver (and it caused me to have my first accident not too long ago)! I’m a grudge-holder when someone hurts my sisters.

“I am both an introvert and an extrovert given the right setting. I am a procrastinator and an over-achiever. I am a winter-lover and summer-baby. I love old people and toddlers. I love kids, but I don’t want kids. I both love and hate Walmart, but mostly hate. I am both an individualist and conformist.

“I am a left-wing bleeding-heart liberal. I’m a citizen of the world. I’m an advocate for the ONE Campaign, the campaign to make poverty history. I am pro-choice and anti-guns. I am a believer in the good in people. I am a Lutheran who believes in karma. I’m a registered democrat who has not missed an election yet. I am a strong-willed democrat. I am a fan of the Golden Rule. I’m a dreamer, pacifist, optimist. I’m a lover, not a fighter. I am nothing less than all of this. I am me.”

##


Saturday, July 18, 2015

At what age will your mental abilities peak?

Well, it’s not 24. 

The notion has persisted for generations – that the human brain’s cognitive abilities peak in the early 20s and then begin a slow march downhill. 

Look around. Think about musicians, salesmen, actors, lawyers, engineers, painters, directors, sculptors, psychologists and novelists. When do their skills peak? To what degree does experience factor into the equation?

A new study out of New England – focused exclusively on cognitive abilities – stands ready to upend the long-held notion that young is, by definition, better. Study authors Joshua Hartshorne and Laura Germine found that, when it comes to thinking, there’s no magic age.  In fact, some skills (e.g., vocabulary recognition) don’t peak in humans until age 65 or 70.

Here’s a quick-look summary of their findings, drawn from nearly 50,000 online participants (note: Hartshorne is with MIT; Germine is a research associate in Harvard’s Psychology Department and a postdoctoral fellow at Harvard-affiliated Massachusetts General Hospital):
·        Ability to recognize and remember faces – peaks between ages 30 and 34;
·        Mental processing speed – as one might expect, peaks around age 18 or 19;
·        Social cognition (the ability to detect other people’s emotions) – peaks from the 40s into the 50s, with no notable decline until after age 60;
·        Short-term memory – peaks around age 25, levels off for several years, then begins to drop at age 35;
·        Crystallized intelligence (measured as vocabulary skills) – rises as one ages, not peaking until about age 65 to 70.

Randy Dotinga, journalist and president of the American Society of Journalists and Authors, identified some weaknesses in the study. Writing for healthday.com, Dotinga noted that it’s not a longitudinal study but is instead based on a single point in time. He also pointed out that the study only included people who are Internet-savvy (although, he acknowledged, the researchers did analyze statistics from studies that were not conducted online).

Nonetheless, the fundamental message is unassailable: human brains are not simple machines which, as they age, begin to deteriorate. Instead, brain plasticity is at work, throughout the life span. Noted a study summary at www.psychologicalscience.org: “It’s not yet clear why these skills tend to peak at different ages, but previous research suggests that it may have to do with changes in gene expression or brain structure as we age.”

When does creativity peak?
Dean Simonton, psychologist and UC Davis professor, has studied the phenomenon of creativity for nearly 40 years, and he explains that “research has consistently found that creativity is a curvilinear (inverted backward-J) function of age – meaning that older individuals would not be creative. However, the empirical and theoretical literature shows that such a pessimistic conclusion is unjustified. Numerous factors operate that help maintain creative output throughout the life span. Indeed, it is actually possible for creators to display a qualitative and quantitative resurgence of creativity in their final years.” Simonton goes on to note that a range of professions – among them poets and painters – have their most productive and prolific years well past what we commonly call “middle age.”

So, when will your mental abilities peak? Well, it’s certainly not 24. 


##

Sunday, July 5, 2015

When you praise someone (or yourself), are you doing it right?

Let’s start with some key research findings on the delicate art of praise:

·        Overly positive praise can backfire, leading children (particularly those with low self-esteem) to back away from future challenges;

·        Given the choice, use process praise (“You worked really hard on that”) instead of person praise (“You’re so smart”).  And here’s why, according to an article written by the Society for Research in Child Development (SRCD): “. . . [P]rocess praise sends the message that effort and actions are the sources of success, leading children to believe they can improve their performance through hard work. Person praise sends the opposite message—that the child’s ability is fixed.”

·        When praising a child, it’s important to avoid the word “incredible”;

·        Parents deliver more process praise to boys than girls; and

·        Inappropriate self-praise can have negative effects.

In study after study, the overriding message is clear: honest, realistic praise (whether given to others, or oneself) is desirable.  So choose your words, and your internal thoughts, carefully. 

Process Praise vs. Person Praise
What’s the difference?  Said SRCD, in their article posted at www.psypost.org: “. . . [W]hen parents praise the effort children make, it leads children to be more persistent and perform better on challenging tasks, while person praise (praising the individual) leads children to be less persistent and perform worse on such tasks.”

In one longitudinal study, led by Assistant Professor of Psychology Elizabeth Gunderson (then with the University of Chicago), researchers examined the relationship between praise and challenge-seeking in toddlers ages one to three.  They found that children who were praised for their effort (as opposed to being praised as individuals) had a more positive approach to challenges five years later. Said Gunderson, quoted in a psypost.org article: “This study suggests that improving the quality of parents’ praise in the toddler years may help children develop the belief that people can change and that challenging tasks provide opportunities to learn.”

Avoid Inflated Praise (and the word “incredible”)
What constitutes inflated praise?  Often it’s the word “incredible” (e.g., Inflated praise: “You made an incredibly beautiful drawing!” Non-inflated praise: “You made a beautiful drawing!”).

Said Utrecht University psychologist Eddie Brummelman, as quoted in an article at www.psychologicalscience.org: “Inflated praise, although well-intended, may cause children with low self-esteem to avoid crucial learning experiences.”  The article continued: “Specifically, the researchers write, rave reviews for a mundane accomplishment can convey an unintended message: Now that you’ve excelled, we’re going to hold you to a very high standard. Since youngsters with low self-esteem are driven by a desire to avoid failure, this can prompt them to avoid challenges.”

Girls vs. Boys
Gunderson’s longitudinal study (cited earlier) found that boys and girls receive the same amount of praise overall, but that boys receive “significantly” more process praise than girls. Not surprisingly, said the researchers, “boys were more likely to have positive attitudes about academic challenges than girls and to believe that intelligence could be improved,” according to the SRCD article. The article quoted Gunderson, who said: “These results are cause for concern because they suggest that parents may be inadvertently creating the mindset among girls that traits are fixed, leading to decreased motivation and persistence in the face of challenges and setbacks.”

Praising Yourself
Research led by Young-Hoon Kim, PhD, of the University of Pennsylvania, found that it’s important for adults to accurately assess their performance, and that falsely boosting their self-esteem can have unintended negative consequences.

According to a press release from the American Psychological Association: “People who try to boost their self-esteem by telling themselves they’ve done a great job, when they haven’t, could end up feeling dejected instead.” Said lead author Kim, as quoted in the APA release: “These findings challenge the popular notion that self-enhancement and providing positive performance feedback to low performers is beneficial to emotional health. Instead, our results underscore the emotional benefits of accurate self-assessments and performance feedback.”

Added co-author Chi-Yue Chiu, of Nanyang Technological University in Singapore: “Distress following excessive self-praise is likely to occur when a person's inadequacy is exposed, and because inaccurate self-assessments can prevent self-improvement.” The study involved young people from both the U.S. and Hong Kong.


##