Sunday, November 8, 2015

How Many Bumps Will You Have Today?


“We can’t have a crisis tomorrow.  My schedule is already full.” – Henry Kissinger

We expect so much. We expect our hours and days to run smoothly, when indeed we know better.

I call it the “Theory of Six Bumps,” and it’s a simple premise: each day will surprise us with six bumps, that is, events that we don’t anticipate but must deal with nonetheless. The dog gets sick and does her business on the carpet (bump #1) . . . a friend desperately needs a ride to drop off his car (bump #2) . . . on our way to the office, we realize that we forgot one critical piece for the afternoon meeting (bump #3).

Bumps, of course, come in all shapes and sizes (misplaced keys, traffic at a standstill on 526), and if we’re lucky, most bumps will be insignificant. Certain ones will take over our lives for a stretch (e.g., sickness and weather-related tragedies), but the true casualty is our dream for a bump-free day. Perhaps it’s time to revise that dream. Instead of anticipating a day free of mishaps (hope springs eternal), embrace the notion that, each day, there’ll be six bumps (some more demanding than others).  Rest assured, they’re coming.

Some time ago I shared the Theory of Six Bumps with my sister, and less than 48 hours later she called and reported, “Well, I’ve already had my six bumps today.” It was 10:30 in the morning (ouch!). As best I can recall, the bumps involved a parking ticket, a broken coffee pot and a computer glitch, a challenging mix if you ask me. I immediately thought to myself, then shared with my sister: “Well, it looks like you’re clear for the rest of the day.” Fortunately she laughed, then recounted the frustrating details of her morning.

Some days I give voice to my bumps. Rushing to a morning meeting, I spot a traffic jam up ahead. Inside I’m thinking: “Bump #1”. Hours later, on the checkout line, I discover that I don’t have my credit card (bump #2) because I gave it to my daughter (an altogether different kind of bump). Once home, I realize that I’ve neglected to shut off the outside water valve and discover that the water line has burst (bump #3, and a sizable one at that).

Of late, one particular bump stands out: it was Saturday morning, nearing 9:30am, when Roe and I heard a strange noise emanating from the air vent.  Within minutes we realized that an animal was trapped in the duct system, and as my frustration began to mount (THIS Saturday was about to disappear), I smiled to myself and said aloud: “Ah, bump #1.”  Six hours later the problem was resolved (we’ll call this a “multi-hour bump”). And while frustration was still a part of my profile, the recognition helped immensely.

Might a bump-free day lie in your future?  Not likely. So just sit back and relax – and count ‘em if you wish.  But know this – the bumps will arrive again tomorrow . . . and the day after that.

Embrace ’em.  It’ll make life a heckuva lot easier.


##

Sunday, November 1, 2015

And Now, the Good News (part 1)


It’s hard to ignore the endless stream of frightening news from around the globe (of late, university shootings, hotel attacks, the refugee crisis in Europe, the ongoing death spiral in South Sudan – just to name a few).  It’s enough to make you think that the world is in a bit of a free fall. 

And, despite our best intentions, with 24/7 news sources delivering wall-to-wall coverage of the dark side, it’s difficult not to be sucked in, to believe that the evil element is about to strike close to home tomorrow, or the day after. Of course, there are legitimate threats. But millions of people across the globe, every day, are taking active steps to improve the planet.  And many of them are succeeding. 

Part I of our series explores one of the most compelling issues of our time, or any time – world hunger. And we’re not just talking about the crises in less-developed countries. Mind-numbing statistics from a recent book (“Hunger: The Biology and Politics of Starvation,” by Butterly and Shepherd) point out that, currently, 49 million people in the United States (roughly 15% of all U.S. households) do not have enough to eat each day, and suffer from recurring hunger.  The challenge exists globally, but a detailed U.N. report released earlier this year maintains that, based on steady progress in recent years, world hunger could conceivably end in our lifetime. To that end, www.bit-of-news.com titled a recent report: “Generation Y may be history’s first zero-hunger generation.” Added Rachel Zelon, executive director of Hunger Relief International: “This is an engaged generation. They want to be a part of the solution.”

Some noteworthy facts – and reminders – from the UN report (titled: “The State of Food Insecurity in the World 2015”): 

·        In the 22-year span between 1990 and 2012, “the number of hungry declined to 795 million,” down from 1 billion (this 21% decline took place at a time when world population rose by 35%, from 5.2 billion to 7 billion).

·        The decline results from “an increasing number of countries [hitting] their MDG (Millennium Development Goal) targets on hunger,” according to the UN report. Said Jose Graziano da Silva, Director General of the FAO (Food and Agriculture Organization): “The near-achievement of the MDG hunger targets shows us that we can indeed eliminate the scourge of hunger in our lifetime.” Of the 129 countries involved, 72 achieved their MDG goals for 2015, “with developing regions as a whole missing the target by a small margin,” added the UN report.

·        Said the UN Report: “. . .  large reductions in hunger were achieved in East Asia and very fast progress was posted in Latin America and the Caribbean, southeast and central Asia, as well as some parts of Africa, showing that inclusive economic growth, agricultural investments and social protection, along with political stability makes the elimination of hunger possible.” Amid this progress, worldhunger.org adds this cautionary note: “[T]here has been the least progress in the sub-Saharan region, where more than one in four people remain undernourished – the highest prevalence of any region in the world.”

·        And another cautionary note, from the same UN report.  Despite great strides in reducing global hunger, they said, progress towards achieving the 2015 food security targets “was hampered in recent years by challenging global economic conditions as well as extreme weather events, natural disasters, political instability and civil strife.” Along these lines, the UN’s World Food Programme points out that: “Good progress was made in reducing chronic hunger in the 1980s and the 1990s, but progress began to level off between 2000 and 2010.” They added: “All of us – citizens, employers, corporate leaders and governments – must work together to end hunger.”


##

Sunday, October 25, 2015

How intuitive are you?

Definition of Intuition: “The capacity for direct knowledge and immediate insight, without any observation or reason.” – David G. Myers, PhD

How intuitive are you? 

I have a hunch that you’ll enjoy this article.

We’ll cover incubation, blind readings, inner voices, ways to improve your intuition and when not to trust it.

And how real is it? Said neurology professor Antoine Bechara, as quoted in an article by Sarah Mahoney for prevention.com: "People treat intuition like it's a dirty word, but it's actually one of the body's survival mechanisms. . . . It's a means of taking you away from danger and steering you toward what is good for you."

We took the quiz!

My wife (Roe) and I took a 22-question survey on intuition, and no surprise – she’s far more intuitive than I am (she outscored me 14-8 so I’ll continue, as I have for 38 years, to defer to her instincts).  And, no surprise again, surveys repeatedly find that women are more intuitive than men, that is, better at decoding human emotions. Said Mahoney: “. . . [T]here's plenty of evidence that women have a bit of an edge: For example . . . when shown pictures of couples, women are better at predicting which are phony and which are real. And in photos of coworkers, women are more likely to discern which one is the other's supervisor.”

In that same article, Bechara points out that: "Intuition is most useful in ambiguous, complex decisions [and] least useful in areas where the outcomes are predictable." Added Mahoney: “So if you're deciding if you should marry or whether to take that job in Boston, use your gut. Buying real estate or deciding whether to go through with that knee surgery? Check your intuition at the door, and listen to the numbers.”

Ways to improve it (incubation, a blind reading)
Researchers at the University of Minnesota, noting that “intuition is a natural process,” recommend spending more time in nature, keeping an Intuition Diary, finding an Intuitive Buddy and ONLY relying on intuition when you’re dealing with “real problems and situations.” The web site beliefnet.com explains how your intuitive powers can be honed with practice, awareness, meditative techniques, imagery, dreams, affirmations and the like.

And then there are these nontraditional methods: incubation and a blind reading. On incubation, author Philip Goldberg (“The Intuitive Edge”) explains: “Add a shot of intuition to your daily analysis. Some people thrive on data. That's fine, but give yourself a definite cutoff point for analysis and then try a trick that psychologists call incubation: Give yourself a fun distraction such as doing a puzzle or reading before making your final decision. This will allow your intuition to play a role.”

And a blind reading? Karen Hogan, writing for life.gaiam.com, provides this quick step-by-step:
1.      “Sit down at a writing table with three blank index cards.
2.      “Think about a decision you are currently grappling with and write three solutions for it, one on each card.
3.      “Turn the cards blank-side-up, shuffle them and place them face-down on a table.
4.      “Run your hands over the cards and notice the feeling of each card.
5.      “Assign a percentage to each card based on how powerfully you’re drawn to it.
6.      “Turn the cards over and take note of the answer with the highest percentage.”

When NOT to trust your intuition
There are plenty of occasions, psychologists report, when you should be wary; here are two.
1.      Test-taking. Studies have disproved the notion that “your first guess is your best guess;” and
2.      Worry. You’re thinking to yourself: “I’m so worried about ________, something must be wrong.” Said Mahoney: “People who worry excessively often confuse general anxiety with a specific fear. Researchers say that such fretting may feel like intuition but is just anxiety in disguise.”

##

Thursday, October 22, 2015

Superheroes: more good than harm?


When I was a young boy, I had just one hero – Superman.  He possessed all of the qualities that our culture admires – he was honest, trustworthy, and helpful.  Of course, he also had x-ray vision and could bend steel with his bare hands.  Oh, and he could fly.  Did I mention that?

Back then, he was the only superhero on the block – he didn’t have to compete with Spiderman, the Hulk, Iron Man, Captain America, Wonder Woman, Thor, or any number of new heroes who fill our screens. From my 10-year-old vantage point, Superman’s sole purpose (or so it seemed) was to take out the bad guys and restore “truth, justice and the American way.” Great work, if you can get it. 

Today, of course, our young ones have a bevy of superheroes from whom to choose – and their makeup and personalities are wide-ranging.  Reel back, for a moment, to 1951 when the first superhero movie appeared (yes, it was Superman).  It took 15 years before another superhero hit the big screen – Batman. Flash forward to the modern era when in 2013 there were seven superhero movies produced (Iron Man 3 is the highest grossing of the lot, having now pulled in more than $1.2 billion). And 2014-2015 is keeping pace. 

The recent superhero surge has re-ignited the debate about whether media violence (TV, movies, video games) fosters more violence in our society. The key question: does their presence lead young boys to be more aggressive or, instead, does superhero fantasy/worship build confidence and a stronger moral culture, one bent on helping others and taking out the bad guys? 

Psychologists line up on both sides of the aisle.  Let’s listen in.

Psychologist Sharon Lamb
In a piece written by Pam Willenz for eurekaalert.org, Lamb explains: "There is a big difference in the movie superhero of today and the comic book superhero of yesterday. . . . Today's superhero is too much like an action hero who participates in non-stop violence; he's aggressive, sarcastic and rarely speaks to the virtue of doing good for humanity. When not in superhero costume, these men, like Ironman, exploit women, flaunt bling and convey their manhood with high-powered guns." 

Psychologist Michael Thompson
Thompson takes the opposite view, as quoted in an article at comicsalliance.com: “The media has provided boys with particular superheroes to believe in and to attach their fantasies to, but the impulse to be a superhero is innate.”  In the article, Thompson added that similar themes have existed “at least since Homer. . . . So I just see boy play as mythic battling.” In a related PBS article, Thompson was quoted as saying: “[While] all boys have normal aggressive impulses which they learn to control, only a small percentage are overly aggressive and have chronic difficulty controlling those impulses.”

Psychologist Robin Rosenberg (editor of “The Psychology of Superheroes” and author of “Superhero Origins: What Makes Superheroes Tick and Why We Care”)
Writing for psychologytoday.com, Rosenberg aligns with Thompson when she explains: “Flying like Superman in virtual reality can make you more helpful in real life. That's what my colleagues and I found in a recent study” at Stanford's Virtual Human Interaction Lab.

Psychologist Melanie Hartgill
Hargill travels the middle road in an article she authored for kidzworld.co.za.  Said Hartgill: “. . . the average child spends more than 50% of their time out of school in front of the television and your average superhero program contains 32 acts of violence in a one-hour show, so when you start doing the math, that's a lot of violence being seen on a regular basis for many of our children. . . . [But] letting your child watch superhero programs on TV is not necessarily all negative providing you are aware of what they are watching and you discuss it with them. . . . Certainly by age 7, children should be able to distinguish between reality and fantasy and also truly understand the difference.  Whereas some children do this younger than 7, it is unlikely to occur under the age of 5.”

##


LIFE LESSONS FROM SUPERHEROES*

Batman: Anyone can be a hero. Batman shows you don't have to be born with superpowers to be a hero. Bruce Wayne can't fly. He's not part-god. He just fights bad guys.

Power Rangers: Teamwork is essential. If you're going to defeat evil you need to work together. Although there is a leader, all of the Rangers need to work as a team.

The Hulk: Control your temper. Mr Green is a good guy until he gets angry. The message to kids? Keep that temper under control or it could get you into trouble.

Spider-Man: Be responsible. As Peter Parker's Uncle Ben says: "With great power comes great responsibility."

Superman: One man can make a difference. He might work alone, but he does what he can to make a difference.

Iron Man: No one is perfect. Tony Stark lacks discipline but he tries hard to overcome the worst parts of his personality with his genius mind and good intentions.


*drawn from an article written by Rachel Lewis for www.thenational.ae.  In the article, Lewis interviewed child development psychologist Naeema Jiwani from the Human Relations Institute in the United Arab Emirates.  

Sunday, October 18, 2015

What’s the best way to influence a teenager?


If you’re raising a teenager, you already know the challenge. And the dangers: careless driving, alcohol and drug abuse, unprotected sex. Helping them make good decisions – that is, helping them accurately assess the short- and long-term risks – seems a distant dream. But a recent study hints at a fresh approach that may influence their behavior. 

And the message is simple: focus on positive, not negative, outcomes. For example, if your teenager has taken up smoking, it’ll be more helpful to emphasize the benefits of stopping (“you’ll have more money, and better skin”) than the potential long-term negative consequences (“you’ll get lung cancer”).  Similarly, when trying to influence teens to cut back on alcohol and drug use, it may be more effective to emphasize improved sports performance than the long-term health risks.

In a press release, the authors explained: “. . . People have a natural tendency to ignore negative information when making decisions, a trait that may be particularly pertinent to young people, who tend to engage in more risky and dangerous behavior.” The study, conducted by researchers from University College London in the UK, was published in the Proceedings of the National Academy of Sciences and was funded by the Wellcome Trust and the Royal Society.

The study’s findings, said study author Dr. Christina Moutsiana, “could help to explain the limited impact of campaigns targeted at young people to highlight the dangers of careless driving, unprotected sex, alcohol and drug abuse, and other risky behaviors."

Added co-author Dr. Tali Sharot: "Our findings show that if you want to get young people to better learn about the risks associated with their choices, you might want to focus on the benefits that a positive change would bring rather than hounding them with horror stories."

In the study’s introductory remarks, the authors provided this broad overview: “Human decision making is markedly influenced by beliefs of what might occur in the future. We form and update those beliefs based on information we receive from the world around us. However, even when we are presented with accurate information, cognitive biases and heuristics restrict our ability to make adequate adjustments to our prior beliefs.”

How was the study conducted? Participants, ages 9 to 26, were asked to assess the relative dangers of potential adverse life events (e.g., car accident, getting lung disease). The researchers then showed participants the actual statistics for these events and noted how each person adjusted their belief, after learning that the risk was higher or lower than they had estimated.  The bottom line: when it’s good news, our beliefs change; when it’s bad news, not so much.

Said the study authors: “The results show that younger participants were less likely to learn from information that shows them that the future is bleaker than expected. In other words, even when they know the risks, they have difficulties using that information if it's worse than they thought it would be. By contrast, the ability to learn from good news remained stable across all ages.”

While buoyed by the findings, Dr. Moutsiana offered this cautionary note.  She told Medical News Today that “while positive messages about not smoking might be more effective than negative messages, other factors, such as social pressure, need to be considered in why teenagers smoke.”  Added Moutsiana, in the Medical News Today article: “We used events related more to physical danger. . . . It is possible that events that relate more to social pressure might have a different effect. Therefore it needs to be examined in control experiments.”

##

Sunday, October 4, 2015

Technology Myths (Part 1): which ones are true?


Finding it hard to keep up?  Me too.  Overwhelmed by new products and new apps for our technical toys, it’s (often) hard to separate myth from magic.  So today we present part 1 of a series we’ll call Technology Myths, and our debut will tackle five technology-related myths. Your job (should you choose to accept it) is to decide which ones, if any, are true. So we ask you:

1: Is a camera with more megapixels always better?
2: Do more bars on your cell phone mean better service?
3: If you have a larger monitor, will you be more productive?
4: Are Apple computers immune from viruses? and
5: Do high-priced HDMI cables significantly enhance TV quality?

1: Cameras – Is a camera with more megapixels always better?
Certainly not, as the focus should be on sensor size, not megapixels. Explains Melanie Pinola, writing for www.popularmechanics.com: “It's true that more megapixels means more detail in larger photos. That detail, though, depends not just on pixel count but also on the camera's sensor: The larger it is, the more light data it can pick up, and the more detailed your images will be. If you add megapixels without increasing the overall size of the sensor, you reduce the amount of light reaching each pixel. Your point-and-shoot camera may have 20 megapixels, but if its sensor is the size of a pinhead, your photos won't look so great.” 
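To make Pinola’s arithmetic concrete, here’s a minimal back-of-the-envelope sketch in Python. The sensor dimensions are nominal, illustrative assumptions (roughly a 1/2.3-inch compact sensor versus a 36 x 24 mm full-frame sensor), not figures from her article:

```python
# Back-of-the-envelope comparison: same megapixel count, very different sensor sizes.
# The dimensions below are nominal, illustrative values, not numbers from the article.

def area_per_pixel_um2(width_mm: float, height_mm: float, megapixels: float) -> float:
    """Approximate light-gathering area per pixel, in square micrometers."""
    sensor_area_um2 = (width_mm * 1000) * (height_mm * 1000)
    return sensor_area_um2 / (megapixels * 1_000_000)

compact = area_per_pixel_um2(6.17, 4.55, 20)     # 20 MP on a small point-and-shoot sensor
full_frame = area_per_pixel_um2(36.0, 24.0, 20)  # the same 20 MP on a full-frame sensor

print(f"Compact sensor:    ~{compact:.1f} square microns per pixel")
print(f"Full-frame sensor: ~{full_frame:.1f} square microns per pixel")
print(f"Roughly {full_frame / compact:.0f}x more light-collecting area per full-frame pixel")
```

Same pixel count, yet each full-frame pixel has roughly thirty times the area with which to collect light, which is why sensor size, not the megapixel number, is the better predictor of image quality.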

2: Signal strength – Do more bars on your cell phone mean better service?
Apparently, how well your Smartphone performs has more to do with congestion than connectivity.  A report in PC World, summarized by Patrick Miller at nbcnews.com, explains: “The signal bars on your cell phone display indicate the strength of your cellular signal to the nearest tower. But if you're connected to a tower that lots of other people are connected to, you could have a strong signal and still have poor service, since everyone's calls are competing for scarce network resources.” The PC World report pointed out that in their 2009 test of 3G service, “signal bars were poor indicators of service quality in 12 of the 13 cities.”  Added Pinola, in her article at www.popularmechanics.com: “It might take about a square block of people in Manhattan to overload a single cell tower, whereas in Wyoming, it would take a population spread over 15 square miles.”

3: Monitors – If you have a larger monitor, will you be more productive?
Back in 2008, when huge computer monitors were coming into vogue, a study from the University of Utah claimed that worker productivity rose by 30-50% with widescreen displays. But PC World, again quoted by Miller, dug into the data and found a few caveats.  They explained: “. . . the study also found a point of diminishing returns. Productivity gains fall in a bell-curve distribution once you hit a certain amount of screen space. For a single-monitor setup, over 26 inches is too much, while dual-display gains top out at 22 inches.” In addition, the study found that, when selecting a second monitor, personal preference (e.g., “I know I’ll be more productive if I have that 24” display”) did not necessarily correlate with heightened performance.

4: Macs – Are Apple computers immune from viruses?
The notion that hackers spend more time attacking Windows-based machines is unassailable.  But according to Wolpin, at blogspot.laptopmag.com, Macs aren’t virus-proof. Said Wolpin: “According to anti-virus software-maker Sophos, based on a study of 100,000 of its users, one in every five Macs carry some sort of malware — these Macs aren’t infected, but they carry malware in much of the same way humans carry dormant viruses such as chicken pox.”


5: HDMI – Do high-priced HDMI cables significantly enhance TV quality?
Beware of those high-priced HDMI cables. Apparently, they don’t make a bit of difference. Explains Pinola, in her piece for www.popularmechanics.com:  “Premium-cable manufacturers would have you believe gold-plated connectors and ‘high-density triple-layer metal-to-metal shielding’ give you a better signal and, therefore, the ultimate picture and audio performance. But generic—and cheaper—cables will deliver the same picture and audio quality. Signal over an HDMI cable is digital; it either comes through or it doesn't.”

Said Miller, who summarized PC World’s findings in his NBC report: “High-quality cables have been a staple of the audio/video business for decades now, and for good reason: As an analog audio or video signal travels from one device to another, it's susceptible to interference and disruption, meaning that the image data as it leaves your DVD player isn't 100 percent identical to the image that shows up on your TV, because certain parts of the signal can get lost on the way there.

“However, digital audio/video standards like DisplayPort, DVI, and HDMI don't have this problem because the data being transmitted over the cable isn't as sensitive as an analog signal; it consists entirely of ones and zeros, and a tremendous drop in signal voltage has to occur before a one starts to look like a zero at the receiving end.”

One final note, according to Pinola: if you’re buying a cable over 6 feet long, a higher quality product can make a difference.  So keep that in mind. 


##

Saturday, September 26, 2015

How Many Friends Do You Need?


When I posed this question to my good friend Scott (a clinical psychologist, no less), he answered glibly: “All of them.”  I laughed, easily agreed, then shared the relevant research.

The number, apparently, is 3 to 5.

But a number of equal interest is Dunbar’s 150 – that is, the number of stable relationships that a human being can comfortably manage at one time. The figure is the brainchild of British anthropologist and evolutionary psychologist Robin Dunbar, who made the claim some 20 years ago. Two decades later, his analysis has yet to face serious challenge. Dunbar’s theory: the number of relationships that an animal can manage (humans included) is directly related to the size of their brain.

But today, with a swath of tools available to facilitate human connection, the question arises: “Are today’s virtual sharing networks expanding our social circle, and the quality of our relationships?” It’s a lively debate.

We’ll re-visit the debate in a moment, but first a bit about Dunbar, and his numeric hypothesis:
  • 150 = casual friends, that is, the average number of people that, if you ran into them at a bar, you would feel comfortable having an impromptu drink with;
  • 50 = close friends, the ones you might invite to a group dinner;
  • 15 = supportive friends, the ones in whom you confide and turn to for support; and
  • 5 = your close support group, that is, your best friends (often family members).

 Dunbar’s central thesis is this: historically, group sizes are remarkably stable. One of his favorite examples: throughout history, company sizes in professional armies have always hovered around 150 (from the Roman Empire to 16th century Spain to 20th century Soviet Union).

Which brings us back to today’s tech tools, and our central question: do today’s tools strengthen or weaken our social connections? The jury is still out.

What’s not in doubt is the importance of friendship. Writes Markham Heid, in a piece for time.com, “. . . [R]esearch has shown that socially isolated people are more than twice as likely to die from heart disease as those with a solid social circle.” And sadly, Heid adds: “The number of Americans who say they have no close friends has roughly tripled in recent decades. . . . ‘Zero’ is also the most common response when people are asked how many confidants they have . . . . And adult men seem to be especially bad at keeping and cultivating friendships.”

Is the Internet expanding our social circle?

Dunbar himself is uncertain what will become of the current generation’s social ties, telling Maria Konnikova, who interviewed him for a piece in The New Yorker: “I don’t think we have enough evidence to argue either way. . . . It’s quite conceivable that we might end up less social in the future, which would be a disaster because we need to be more social—our world has become so large.”

Added Dunbar, again quoted by Konnikova: “The amount of social capital you have is pretty fixed. . . . It involves time investment. If you garner connections with more people you end up distributing your fixed amount of social capital more thinly so the average capital per person is lower . . . . Traditionally, it’s a sixty-forty split of attention: we spend 60% of our time with our core groups of fifty, fifteen, and five, and 40% with the larger spheres. Social networks may be growing our base, and, in the process, reversing that balance.”

And then there’s this noteworthy Dunbar observation, shared in an interview with Bloomberg.com: “In the end, we rely heavily on touch and we still haven't figured out how to do virtual touch. Maybe once we can do that we will have cracked a big nut.”

Dave Morin, one of Path’s co-founders, agrees with Dunbar that, as of yet, the new tools (e.g., Facebook and Twitter) have not expanded our social universe. Said Morin: “Dunbar’s work has helped to crystallize a debate among social media architects over whether even the most cleverly designed technologies can expand the dimensions of a person’s social world. As he puts it: ‘The question is, does digital technology in general allow you to retain the old friends as well as the new ones and therefore increase the size of your social circle?’ The answer seems to be a resounding no, at least for the moment.”

Is the Internet simply the latest utility supporting social life?

Remember the telephone? When it was introduced nearly 140 years ago a common fear was its negative impact on human connection. After all, people would no longer be face-to-face, so the theory went, and social ties would suffer. Fast forward to 2015 and these same fears are surfacing. Today’s elders ask: With their heads buried in their phones, what will become of the Smartphone Generation?

Robert Cannon, FCC counsel and founder of Cybertelecom, thinks it’s an empty concern. Said Cannon: “The tension between the net and social engagement will vaporize in much the same way that thoughts about the telephone network vaporized and it came to be taken-for-granted. People do not ask if the telephone is an alienating social force. The phone is a utility supporting social life. Likewise, the net will come to be assumed as a utility for social life. How else would I know when church starts, when the game begins, where we are meeting for drinks, or what the weather for our trip might be?”

Karl Auerbach, CTO at InterWorking Labs, agrees: “Ships and airplanes can be argued to be tools engendering either separation or closeness. Why should the internet be any different? I am continuously amazed at the ability of people to adapt the net to improve their interpersonal links.”

Good force or evil?  Naturally, time will tell. For now, though, I think I’ll shoot my friend(s) a quick text.

##

Saturday, September 5, 2015

If we named our hurricanes differently, would it save lives?


Picture this scenario: it’s mid-October and a weather alert pops on your screen, advising you that a hurricane is headed your way. It’s still four days away, and while they’re not altogether certain how powerful it will be, or where it may strike land, you’re thinking – what should I do to prepare, if anything? 

Now suppose you heard that the hurricane’s name was Jennifer.  But later you learned that it was named Jack.  Would it make a difference?

Apparently it would, according to researchers from the University of Illinois at Urbana-Champaign who maintain that people judge a hurricane’s risk, in part, based on its name.  In their study “Why Have Female Hurricanes Killed More People than Male Ones?” they explain that the more feminine the name, the less likely people are to take preparatory action (note: their study was published in the Proceedings of the National Academy of Sciences).

Said the authors:  

“Meteorologists and geoscientists have called for greater consideration of social science factors that predict responses to natural hazards. We answer this call by highlighting the influence of an unexplored social factor, gender-based expectations, on the human toll of hurricanes that are assigned gendered names. Feminine-named hurricanes (vs. masculine-named hurricanes) cause significantly more deaths, apparently because they lead to lower perceived risk and consequently less preparedness. Using names such as Eloise or Charlie for referencing hurricanes has been thought by meteorologists to enhance the clarity and recall of storm information. We show that this practice also taps into well-developed and widely held gender stereotypes, with potentially deadly consequences.”

The authors’ conclusions have been challenged on several counts, but their message is worth serious consideration: would we save lives if we named hurricanes based on their severity? In other words, when we decide whether to take action, for a coming storm, to what degree are we influenced by the relative femininity and masculinity of a hurricane’s name?

The authors’ analysis included 94 hurricanes that struck the U.S. between 1950 and 2012, recognizing that up until 1979, hurricanes were only given female names (for the dataset 1950-1978, the researchers did examine the relative femininity of the name).

The study was strongly criticized by social scientist Jeff Lazo from the National Center for Atmospheric Research. According to an article by Ed Yong, on National Geographic’s web site, “[Lazo] thinks the pattern is most likely a statistical fluke which arose because of the ways in which the team analyzed their data.”

As the debate takes flight, few would disagree that “men are linked to strength and aggression, and women with warmth and passivity,” according to Yong’s article. The question is: do these unconscious biases have real-life consequences in how we prepare for impending storms?


Said study author Sharon Shavitt, as quoted in Yong’s article: “It may make sense to move away from human names, but other labels could also create problems if they are associated with perceptions of mildness or gentleness. . . . The key is to provide information and labels that are relevant to the storm’s severity.”

##

Sunday, August 16, 2015

Did you ever notice that . . .

Did you ever notice that . . . you can tell if a person is left handed or right handed by the way that they clap?  Try it out at dinner tonight.  Ask your dinner mates to start clapping and notice which hand is active and which hand is passive (your passive hand “receives” the clap).  Now encourage them to clap the other way, that is, to make their non-dominant hand the active one.  Feels strange, no?

Some folks, of course, are “neutral clappers” (both hands meeting in the middle), but they’re a rare breed. So the next time you’re at a concert, or a play, take a peek at how people clap – you’ll be able to immediately tell if they’re lefty or righty (my wife Roe, by the way, is ambilevous: “having the ability to perform manual skill tasks with both hands”).

Did you ever notice that . . . when families are walking, the father walks ahead of the pack, while the mother trails the field?  Why is that?

I suppose it’s linked to DNA. After all, back in the day, it made sense to have the hunter-gatherer out in front – protecting the family, finding the next meal. But today, what’s the thinking?  I remember a field trip some years ago that my wife organized – an overnighter with 150 fifth graders accompanied by 25 chaperones (three of whom were male). No big surprise that one of the men (walking out ahead) lost track of one of his kids. 

So I wonder: do we still need to be out in front?

Did you ever notice that . . . in conversation, some people perpetually use pronouns instead of first names? (e.g., “My wife and I are heading to Florida” instead of “Roe and I are heading to Florida”).

Granted, if we are meeting each other for the first time, and I don’t know your sister’s name, a pronoun makes perfect sense. But over time, I’d think that people would make the switch. Yet many don’t.  Why is that? Why do people – long after they know us, and they know that we know all the players – continue to use pronouns? Perhaps it says something about our personality, or, more simply, is just a verbal habit that mimics our parents’ conversational style. Either way, I must admit, it’s a mystery to me (as it happens, early in a relationship, I’ll make a point of introducing family members with both pronoun and first name (“my sister Ilene”), and then, as the conversation evolves, I’ll switch to the name alone. But I may be the odd one here).

##


Sunday, August 9, 2015

Brain-Training Exercises: Do They Work?


“Before investing time and money on brain games, consider what economists call opportunity costs: If an hour spent doing solo software drills is an hour not spent hiking, learning Italian, making a new recipe, or playing with your grandchildren, it may not be worth it. But if it replaces time spent in a sedentary state, like watching television, the choice may make more sense for you.” – Stanford Center on Longevity

They’re fun. They’re challenging. And they’re somewhat addicting.  But do they work? Do brain-training exercises – that is, sitting in front of a computer performing specific repetitive tasks – really improve cognitive function?

An active debate has emerged within the scientific community, with neuroscientists and behavioral psychologists analyzing first-generation data to assess whether brain-training exercises make a substantive difference in cognitive vigor.

This past fall the Stanford Center on Longevity, working with the Berlin Max Planck Institute for Human Development, issued a word of caution, urging consumers to be wary of exaggerated claims that brain-training exercises will significantly enhance brain function, and brain fitness.

And while the Center’s report, in this writer’s view, was quite balanced in assessing the new field, a group of 127 scientists, from 18 countries, took issue with several of the Center’s findings and crafted an Open Letter to share their concerns.  

What’s clear in all of this is that the “brain fitness” movement, in due time, may well resemble the thriving physical fitness movement.  Said Alvaro Fernandez, in a piece for the Huffington Post: “It took decades of conflicting research and confusing media coverage to finally spread the idea that daily life activities are far from sufficient to keep us physically fit . . . From those humble beginnings, health club memberships in 2014 amounted to $78+ billion dollars in annual revenues.” This same notion, many believe, will apply to brain fitness in the coming years. 

So what exactly did the Stanford Center report have to say?  And how did the 127 scientists, in their Open Letter, respond? Some highlights:

Stanford Center on Longevity

·        “It would be appropriate to conclude . . . that the potential to learn new skills remains intact throughout the life span. However at this point it is not appropriate to conclude that training-induced changes go significantly beyond the learned skills, that they affect broad abilities with real-world relevance, or that they generally promote ‘brain health’.”

·        “These conclusions do not mean that the brain does not remain malleable, even in old age. Any mentally effortful new experience, such as learning a language, acquiring a motor skill, navigating in a new environment, and, yes, playing commercially available computer games, will produce changes in those neural systems that support acquisition of the new skill.”

·        “Some of the initial [research] results are promising and make further research highly desirable. However, at present, these findings do not provide a sound basis for the claims made by commercial companies selling brain games.”

·        “We also need to keep in mind opportunity costs. Time spent playing the games is time not spent reading, socializing, gardening, exercising, or engaging in many other activities that may benefit cognitive and physical health of older adults. Given that the effects of playing the games tend to be task-specific, it may be advisable to train an activity that by itself comes with benefits for everyday life. Another drawback of publicizing computer games as a fix to deteriorating cognitive performance is that it diverts attention and resources from prevention efforts. The promise of a magic bullet detracts from the message that cognitive vigor in old age, to the extent that it can be influenced by the lives we live, reflects the long-term effects of a healthy and active lifestyle.”

The 127 scientists respond:

·        “. . . [A] substantial and growing body of evidence shows that certain cognitive training regimens can significantly improve cognitive function, including in ways that generalize to everyday life.”

·        “Over three decades, researchers have built a huge body of evidence that brain plasticity is a lifelong phenomenon – as you acknowledge. However, the [Stanford Center] statement fails to acknowledge that this evidence was derived from training experiments directly documenting the improvement of sensory, cognitive, motor, and functional performance.”

Leading the Open Letter movement was Dr. Michael Merzenich, a member of both the National Academy of Sciences and the Institute of Medicine. Said Merzenich: “The authors of the Longevity Center statement properly concluded that a large body of work has shown there is plasticity throughout the brain and throughout life. . . . It was rather astounding, then, that this same group failed to notice that we proved that through hundreds of studies showing we can drive positive change in the brain through directed, intensive, computer-guided training. It’s silly that anyone would think that we can make cognitive training that works in labs, but not in people’s homes.”


##

Sunday, August 2, 2015

Geniuses in the office: more pain than pleasure?


Have you ever worked for a genius?  I have. 

The year was 1979, and though the subsequent six years were emotionally painful (he had this unique ability to make you feel inept at every turn), there were positive results, economically speaking.

Those six years remain fresh in my mind, leading me to wonder, from time to time: if you work for a genius, should you bolt or should you stick?

That was the dominant thought that ran through my mind when I read Walter Isaacson’s grueling biography of Steve Jobs. Jobs, of course, was a genius by everyone’s account, changing the course of five (yes, five) industries. Yet nearly every page of Isaacson’s text revealed Jobs’ cruel and demonic treatment of colleagues. So I wondered: why did these people stick? 

Last week, I found the answer. 

It appeared on page 332 of Doris Kearns Goodwin’s glorious book “The Bully Pulpit” (which traces the intertwining lives of Presidents Teddy Roosevelt and William Taft).  The dynamic passage – authored by Ida Tarbell, the nation’s leading journalist a century ago – was written about Sam McClure, the nation’s leading publisher. But it could just have easily been written about Jobs, a century later.  Wrote Tarbell, of McClure (in a letter to colleagues):

“Never forget that it was he and nobody else who has created that place. . . He is a very extraordinary creature, you can’t put him into a machine and make him run smoothly with the other wheels and things. . . . . Able methodical people grow on every bush but genius comes once in a generation and if you ever get in its vicinity thank the Lord & stick.  You probably will be laid up now and then in a sanitarium recovering from the effort to follow him but that’s a small matter if you really get into touch finally with that wonderful brain. . . . If there was nothing in all this but the annoyance and uncertainty & confusion – that is, if there were no results – then we might rebel, but there are always results – vital ones. . . . The great schemes, the daring moves in that business have always been [his]. They will continue to be. His one hundredth idea is a stroke of genius. Be on hand to grasp that one hundredth idea.”

Hire or Fire?
So when it comes to genius, do we bolt, or do we stick?  Do we hire or do we fire?

The verdict is unclear.  Enthusiasts maintain that businesses need to actively recruit geniuses, in order to advance the organization. Others, however, insist that geniuses do more harm than good and should be led out to pasture.  Below are two contrasting views.  Take your pick.

Hire ‘Em – Dave Logan, writing for CBS Money Watch
Logan urges companies to hire geniuses, then learn to manage them. Logan acknowledges that often, as bright as geniuses are, they can be incredibly difficult to work with (Logan jokes: “. . . the chance that [the genius] will offend someone in a conservative culture is 100% - in the first week.”). Nonetheless, Logan recommends that you pull the trigger, saying: “If the hiring manager knows the tradeoffs, they’ll often do the right thing for everyone by hiring the genius, and then working to minimize the deficits, or clean up messes when they happen.” 

Fire ‘Em – Scott Lowe, independent consultant, in an article for www.techrepublic.com
Says Lowe: “Eventually, when a serious attitude problem exists, it’s more than likely that you’ll need to fire the person for the sake of the rest of the team. . . For my own organization, I hire attitude first, skill second. . . . Look for people who fit our culture and have appropriate skills to do the job. . . . You can teach skills, but teaching attitude is much harder.”

Stick or bolt?  Hire or fire?  It just might take a genius to decide what to do.


##

Sunday, July 26, 2015

How do you describe yourself?

“We’re like onions, and we have layers.” – Shrek

How do you describe yourself?

Are you cautious or care-free? Decisive or indecisive? An introvert or extrovert?

Humans love to label, and labels fall easily from our lips. We label everything in our path – from habits and character traits to style, tempo and drive.

But why? What compels us to do this? Are we not, as human beings, dynamic creatures who are constantly evolving? If so, why stick a label on it?  Leading to this nagging question: do self-imposed labels serve us well?  Or do they limit our ability to change and grow?

In today’s high octane culture, enveloped by intense media scrutiny (of every one, and every thing!), it’s difficult not to classify (e.g., he’s lazy, she’s a math whiz, he’s so forgetful), not just others, but ourselves (I’m not a risk taker, I’m too shy).  But do they serve us well?

I vote no. I maintain that labels, by and large, inhibit our growth and impede our ability to make the most of our lives. They limit us. They restrict us. They make it more difficult for mid-course corrections.

Part of the problem, of course, is the binary nature of self-evaluation. It’s common to describe people – and ourselves – as introvert/extrovert, optimist/pessimist, Type A/Type B, Republican/ Democrat, high maintenance/low maintenance.  Why, pray tell, are there only two categories?

A search of the literature finds some breaks in the binary stranglehold (ambiverts are now a recognized category, and I recently discovered what it means to be a Type C and Type D personality). Even Myers-Briggs, which graciously offers us 16 categories, hinges on a binary framework – introversion/extroversion, sensing/intuition, thinking/feeling and judging/perceiving.

Aren’t humans a little more complex than that?

Which brings us to this brilliant piece written by a young woman who was asked, simply: “How do you classify yourself?” (her answer appeared on the website www.worldinconversation.org):

“I don’t like labels. In my opinion, people are people. Everyone has differences and everyone has similarities. As Shrek says, we’re like onions and we have layers. Why would anyone want to define himself or herself by just one of those layers? Labels have started wars, torn families apart, and caused heartache. I like to think that I’m made up of many things, and that just one thing doesn’t define me. However, for the sake of the prompt, these are what define me:

“I am a single white female. I am German and Hungarian. I am a sister, daughter, niece, granddaughter, great niece, and cousin. In the summer, I’m a child-care provider. The rest of the year, I’m a poor college kid. I’m a musician: I play trumpet, piano, and ukulele. I am a ballet, tap, and swing dancer. I’m a singer, an alto to be specific. I am a connoisseur of Lindor truffles, peach pie, Doritos, and raspberry smoothies. I am a lover of oldies’ music, high-heels, and vintage clothes. I’m an avid celebrity-rag reader. I’m a Jack Benny Program listener. I’m a collector of Smokey Bear paraphernalia, post-cards, Broadway show pins, and Snapple caps. I am a member of the Pennsylvania State Marching Blue band and the a capella group, Blue in the FACE. I am a world explorer.

“I am imperfect. I’m a crybaby. I am a complainer. I am a devil’s advocate. I’m a prep. I am indecisive. I’m a ‘goody-two-shoes.’ I’m a band geek. I’m a Harry Potter junkie. I am enthusiastic and loud when it’s not socially acceptable. I’m a lover of horribly written Meg Cabot novels like The Princess Diaries series. I’m a nervous giggler in inappropriate situations. I’m a too-cautious driver (and it caused me to have my first accident not too long ago)! I’m a grudge-holder when someone hurts my sisters.

“I am both an introvert and an extrovert given the right setting. I am a procrastinator and an over-achiever. I am a winter-lover and summer-baby. I love old people and toddlers. I love kids, but I don’t want kids. I both love and hate Walmart, but mostly hate. I am both an individualist and conformist.

“I am a left-wing bleeding-heart liberal. I’m a citizen of the world. I’m an advocate for the ONE Campaign, the campaign to make poverty history. I am pro-choice and anti-guns. I am a believer in the good in people. I am a Lutheran who believes in karma. I’m a registered democrat who has not missed an election yet. I am a strong-willed democrat. I am a fan of the Golden Rule. I’m a dreamer, pacifist, optimist. I’m a lover, not a fighter. I am nothing less than all of this. I am me.”

##


Saturday, July 18, 2015

At what age will your mental abilities peak?



Well, it’s not 24. 

The notion has persisted for generations – that the human brain’s cognitive abilities peak in the early 20s and then begin a slow march downhill. 

Look around. Think about musicians, salesmen, actors, lawyers, engineers, painters, directors, sculptors, psychologists and novelists. When do their skills peak? To what degree does experience factor into the equation?

A new study out of New England – focused exclusively on cognitive abilities – stands ready to upend the long-held notion that young is, by definition, better. Study authors Joshua Hartshorne and Laura Germine found that, when it comes to thinking, there’s no magic age.  In fact, some skills (e.g., vocabulary recognition) don’t peak in humans until age 65 or 70.

Here’s a quick-look summary of their findings, drawn from nearly 50,000 online participants (note: Hartshorne is with MIT, Germine is a research associate in Harvard’s Psychology Dept. and a postdoctoral fellow at Harvard-affiliated MGH):
·        Ability to recognize and remember faces – this ability peaks between ages 30 and 34;
·        Mental processing speed – as one might expect, this skill peaks around age 18 or 19;
·        Social cognition (the ability to detect other people’s emotions) – peaks in the 40s to age 50, with no notable decline until after age 60; 
·        Short-term memory – peaks around age 25, levels off for several years, then begins to drop at age 35;
·        Crystallized intelligence (measured as vocabulary skills) rises as one ages, not peaking until about age 65 to 70.

Randy Dotinga, journalist and President of the American Society of Journalists and Authors, identified some weaknesses in the study. Writing for healthday.com, Dotinga noted that it’s not a longitudinal study but instead is based on a single point in time. Dotinga also pointed out that the study only included people who are Internet-savvy (although, the author acknowledges, the researchers did analyze statistics from studies that were not online). 

Nonetheless, the fundamental message is unassailable: human brains are not simple machines which, as they age, begin to deteriorate. Instead, brain plasticity is at work, throughout the life span. Noted a study summary at www.psychologicalscience.org: “It’s not yet clear why these skills tend to peak at different ages, but previous research suggests that it may have to do with changes in gene expression or brain structure as we age.”

When does creativity peak?
Dean Simonton, psychologist and UC Davis professor, has studied the phenomenon of creativity for nearly 40 years, and he explains that “research has consistently found that creativity is a curvilinear (inverted backward) function of age – meaning that older individuals would not be creative. However, the empirical and theoretical literature shows that such a pessimistic conclusion is unjustified. Numerous factors operate that help maintain creative output throughout the life span. Indeed, it is actually possible for creators to display a qualitative and quantitative resurgence of creativity in their final years.” Simonton goes on to note that a range of professions – among them poets and painters – have their most productive and prolific years well past what we commonly call “middle age.”

So, when will your mental abilities peak? Well, it’s certainly not 24. 


##

Sunday, July 5, 2015

When you praise someone (or yourself), are you doing it right?


Let’s start with some key research findings on the delicate art of praise:

·        Overly positive praise can backfire, leading children (particularly those with low self-esteem) to back away from future challenges;

·        Given the choice, use process praise (“You did a wonderful job”) instead of person praise (“You’re so smart”).  And here’s why, according to an article written by the Society for Research in Child Development (SRCD): “. . . [P]rocess praise sends the message that effort and actions are the sources of success, leading children to believe they can improve their performance through hard work. Person praise sends the opposite message—that the child’s ability is fixed.”

·        When praising a child, it’s important to avoid the word “incredible”;

·        Parents deliver more process praise to boys than girls; and

·        Inappropriate self-praise can have negative effects.

In study after study, the overriding message is clear: honest, realistic praise (whether given to others, or oneself) is desirable.  So choose your words, and your internal thoughts, carefully. 

Process Praise vs. Person Praise
What’s the difference?  Said SRCD, in their article posted at www.psypost.org: “. . . [W]hen parents praise the effort children make, it leads children to be more persistent and perform better on challenging tasks, while person praise (praising the individual) leads children to be less persistent and perform worse on such tasks.”

In one longitudinal study, led by Assistant Professor of Psychology Elizabeth Gunderson (then with the University of Chicago), researchers examined the relationship between praise and challenge-seeking, in toddlers ages one to three years old.  They found that children who were praised for their effort (as opposed to praised as individuals) had a more positive approach to challenges just five years later. Said Gunderson, quoted in a psypost.org article: “This study suggests that improving the quality of parents’ praise in the toddler years may help children develop the belief that people can change and that challenging tasks provide opportunities to learn.”

Avoid Inflated Praise (and the word “incredible”)
What constitutes inflated praise?  Often it’s the word “incredible” (e.g., Inflated praise: “You made an incredibly beautiful drawing!” Non-inflated praise: “You made a beautiful drawing!”).

Said Utrecht University psychologist Eddie Brummelman, as quoted in an article at www.psychologicalscience.org: “Inflated praise, although well-intended, may cause children with low self-esteem to avoid crucial learning experiences.”  The article continued: “Specifically, the researchers write, rave reviews for a mundane accomplishment can convey an unintended message: Now that you’ve excelled, we’re going to hold you to a very high standard. Since youngsters with low self-esteem are driven by a desire to avoid failure, this can prompt them to avoid challenges.”

Girls vs. Boys
Gunderson’s longitudinal study (cited earlier) found that boys and girls receive the same amount of praise overall, but that boys receive “significantly” more process praise than girls. Not surprisingly, said the researchers, “boys were more likely to have positive attitudes about academic challenges than girls and to believe that intelligence could be improved,” according to the SRCD article. The article quoted Gunderson, who said: “These results are cause for concern because they suggest that parents may be inadvertently creating the mindset among girls that traits are fixed, leading to decreased motivation and persistence in the face of challenges and setbacks.”

Praising Yourself
Research led by Young-Hoon Kim, PhD, of the University of Pennsylvania, found that it’s important for adults to accurately assess their performance, and that falsely boosting their self-esteem can have unintended negative consequences.

According to a press release from the American Psychological Association: “People who try to boost their self-esteem by telling themselves they’ve done a great job, when they haven’t, could end up feeling dejected instead.” Said lead author Kim, as quoted in the APA release: “These findings challenge the popular notion that self-enhancement and providing positive performance feedback to low performers is beneficial to emotional health. Instead, our results underscore the emotional benefits of accurate self-assessments and performance feedback.”

Added co-author Chi-Yue Chiu, of Nanyang Technological University in Singapore: “Distress following excessive self-praise is likely to occur when a person's inadequacy is exposed, and because inaccurate self-assessments can prevent self-improvement.” The study involved young people from both the U.S. and Hong Kong.


##

Sunday, June 28, 2015

Does the moon influence the human body? (next full moon: July 2)

Does the moon influence the human body?

The question has stirred for ages, and a group of scientists in Switzerland recently reported that they’ve found evidence (they called it “statistically significant”) that human sleep suffers during a full moon. The study, though well controlled, was small (just 33 individuals), and skeptics were quick to challenge its claims. Fred Turek, a chronobiologist at Northwestern University, told NPR: “Extraordinary claims require extraordinary evidence,” and the new study, said Turek, falls far short of providing that evidence.

Nonetheless, the scientists’ claims were noteworthy. Said the study report: “We found that around the full moon, EEG delta activity during NREM sleep, an indicator of deep sleep, decreased by 30%, time to fall asleep increased by 5 minutes, and EEG-assessed total sleep duration was reduced by 20 minutes.” In short: less total sleep, diminished deep sleep, a longer time to fall asleep and lower melatonin levels. 

The researchers, led by Christian Cajochen, who studies circadian rhythms and sleep at the University of Basel, added: “. . . to our knowledge, this is the first report of a lunar influence on objective sleep parameters such as EEG activity during NREM sleep and a hormonal marker of the circadian timing system (melatonin) in humans . . . .”  They went on: “This is the first reliable evidence that a lunar rhythm can modulate sleep structure in humans when measured under the highly controlled conditions of a circadian laboratory study protocol without time cues.”  In other words, the subjects were in lab rooms for days on end, without any cues from natural light.

Author Niall McCrae, in a piece written for www.theconversation.com, noted: “. . . [T]he results suggest that humans might have an innate circalunar rhythm, that is, a body clock of physiological activity with a length that roughly correlates to the length of the lunar cycle (29.5 days).” Quoting the study, McCrae said that “at full moon, the peak in melatonin levels was delayed by around 50 minutes.” Concluded McCrae, author of a book on the moon and its influence on mental illness: “Christian Cajochen and fellow chronobiologists [have] provided perhaps the strongest indication yet that the moon really does affect the mind.”

Do animals have a circalunar clock?  Cajochen and colleagues cited recent research which found such a clock in a marine midge. “This circalunar clock is thought to tick inside many animals, running in synchrony with the tides and working in conjunction with the animals’ circadian clock.” The researchers cited a study by Wikelski and Hau which “found that those Galapagos marine iguanas with the most accurate circalunar clock were most likely to survive tough times, presumably because they were the best at reaching feeding spots first . . . . ”

So, while Cajochen and fellow chronobiologists keep looking for answers (said the report: “It remains challenging to unravel the neuronal underpinnings of such a putative lunar clock in humans”), you might consider going to bed a touch earlier this Thursday, July 2. After all, it’s a full moon.


##

Sunday, June 7, 2015

Becoming an expert: is 10,000 hours still the mark?

Becoming an expert: is 10,000 hours still the mark?  

The notion has been circulating for decades: to become an expert, you need to practice for 10,000 hours (and in case you’re wondering just how long that is, I was too: if you practiced two hours a day for nearly 14 years, that would equal 10,000 hours). 
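For the curious, here’s the rough arithmetic behind that figure (a back-of-the-envelope estimate that assumes two hours of practice every single day, with no days off):

\[
\frac{10{,}000\ \text{hours}}{2\ \text{hours/day}} = 5{,}000\ \text{days}, \qquad \frac{5{,}000\ \text{days}}{365\ \text{days/year}} \approx 13.7\ \text{years}
\]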

But the notion, like so many, has come under fire, leading to these pointed questions: does this theory apply equally to skills that are physical (e.g., tennis, violin), cognitive (e.g., chess) and social? And how do you account for the fact that some athletes, and chess players, become proficient in far less than 10,000 hours?  Further, is it any type of practice? Or something more specific?

The most recent challenge came from a group of psychologists at five universities (Michigan State, Rice, Southern Illinois, the University of Liverpool and Edith Cowan University in Australia). They pushed back against the 10,000-hour claim made popular by Malcolm Gladwell in his 2008 book Outliers. Said the researchers, as quoted by Shaunacy Ferro in a piece for fastcodesign.com:

"The evidence is quite clear that some people do reach an elite level of performance without copious practice, while other people fail to do so despite copious practice."

Both Gladwell and Anders Ericsson (the Florida State University psychologist whose work generated the 10,000-hour finding) have taken issue with the critiques. Gladwell maintains that the rule applies to “cognitively demanding tasks” (e.g., chess, violin) “and not physical attributes,” according to a Forbes.com article. And Ericsson argues that “his critics had examined too many beginners rather than expert performers,” according to Ferro’s piece.

Practice, these critics maintain, can explain only one-third of the variation in performance, according to Ferro, who concludes: “. . . practice alone won't make you Yo Yo Ma. It could also have to do with personality, the age you started, intelligence, or something else entirely.”

So what does it take to become an expert?

Apparently, it’s something called “deliberate practice,” which, its proponents maintain, is neither work nor play: 

“Deliberate practice is . . . not just business as usual. . . . Deliberate practice is not work and it is not play. Those activities are important, but they don’t count toward your 10,000 hours. 

Work is where we exercise the skills we already have. . . . [The] performance improvement from time spent at work is minimal compared to time spent in deliberate practice. The fact that you’ve managed a team for 10 years doesn’t automatically make you a world-class manager. Work isn’t deliberate practice.”

So what is deliberate practice? According to expertenough.com: "Deliberate practice is a highly structured activity engaged in with the specific goal of improving performance. Deliberate practice is different from work, play and simple repetition of a task. It requires effort, it has no monetary reward, and it is not inherently enjoyable. When you engage in deliberate practice, improving your performance over time is your goal and motivation."

Daniel Goleman, in his book Focus: The Hidden Driver of Excellence, quoted Ericsson as saying:

“You don’t get benefits from mechanical repetition, but by adjusting your execution over and over to get closer to your goal.”

So, the question is, in your bid to become more proficient (say, at tennis, golf, or as a performer), how much “deliberate practice” are you putting in?  In other words, is your practice focused on improving?  Or is it more repetitious in nature? 

Think now, for a moment, about the world-class tennis players who took the world stage over the last two weeks at the French Open in Paris. How many hours have they put in? And how many of those hours, do you think, involved deliberate practice?

The bottom line: it now seems a touch obvious that proficiency, in any realm, is more a matter of concentration and focus than mere hours.  In his book, Goleman noted:

“After about 50 hours of training – whether in skiing or driving – people get to that ‘good-enough’ performance level, where they can go through the motions more or less effortlessly. They no longer feel the need for concentrated practice, but are content to coast on what they’ve learned. No matter how much more they practice in this bottom-up mode, their improvement will be negligible.”

Or, perhaps, expertise is connected to love. Here’s what wisdomgroup.com had to say:

“The elite don’t just work harder than everybody else. At some point the elites fall in love with practice to the point where they want to do little else.”

##