Tuesday, April 7, 2009

Hitting a Moving Target—What is Normal?


“We have an epidemic of attention deficit disorder—or at least, we have an epidemic of diagnoses of that condition. And the culprit most often named? The use of computers,” wrote Hugo and Nebula Award-winning author Robert J. Sawyer in his March 20, 2009, piece in the Ottawa Citizen entitled “All Screens Are Not Equal.” “But is there really something wrong with huge numbers of young people today?” he goes on to ask. “Has computer use rotted their brains?”

“Or is it—perhaps—that there’s something wrong with how we’re defining normal?” taunts Sawyer.

He goes on to say that our psychological tests for measuring attention were developed between the 1950s and the 1990s, when the dominant screen in society was the passive screen of television, the era of the “couch potato”. Today’s dominant screen is the computer, whose interactivity has very little in common with passive TV viewing. In fact, asserts Sawyer, the computer is “far from being harmful.” Rather, it is “a key to survival in the new world.”

Psychologists have demonstrated that playing video games, for instance, exercises the mind and improves memory and alertness. Richard Alleyne, science correspondent for the Telegraph, reported that such play also reverses cognitive decline, “making the brain more agile” and improving an elderly person’s ability to juggle multiple tasks. So much so that a growing number of computer games are being marketed to the elderly (e.g., Nintendo DS portable games). The December 11, 2008, issue of the Washington Post reported that past studies “have shown that playing video games [has] many positive benefits, ranging from improved problem-solving abilities in young people to improved operating skills in surgeons.”

The December 2008 issue of Psychology and Aging reports the findings of Chandramallika Basak, Art Kramer and their colleagues, who tested the cognitive abilities of people in their 60s and 70s before and after they played the real-time strategy video game “Rise of Nations”, which rewards the complex task of creating a society: building cities, employing people and expanding territory. You guessed it: those who played the game displayed enhanced cognitive abilities, memory, and alertness. “When we look at improvements in cognition,” says Basak, “it’s not just one thing that it is affecting, it is all integrative.” The benefits remained for weeks afterward and transferred to everyday tasks: things like “scheduling, planning, working memory, multi-tasking, and dealing with ambiguity,” which, according to Kramer, are harder for older people to do.

Sawyer was referring to a changing—evolving—paradigm in society. It is an evolution that embraces technology as part of who and what we are and will be. And that scares people. Change usually scares people. It takes us out of our “comfort zones” and strips us of our carefully constructed masks. It puts us out there, where that part of us that neatly hides beneath our conformity is exposed: the part of us that is different, unique, and in many cases, unrealized. Change liberates our creative souls; it gives us a chance to be and express who we really are.

I found the commentary by online readers on Sawyer’s article fascinating. It was a mix of enthusiastic agreement, cynical skepticism and barely masked fear. The major concerns were that video games and computer use are socially isolating and that whatever cognitive benefits they offer come at the expense of other healthy activities. One comment I found particularly interesting responded to Sawyer’s assertion that kids are “creating videos, building websites, and publishing thousands of words of text each day…doing high-quality work…doing things on their own that it used to take large teams to accomplish.” The comment suggested that this was a bad thing, because it was ultimately anti-social and because scientific studies had proven that multi-tasking “actually harms the brain’s ability to learn.”

Is it not a bit hypocritical to point this “moral” finger at kids’ creative use of computers when we as adults (many of us, at any rate) spend our work time AND play time in front of computers ourselves? Basak suggests that playing strategy-based games such as chess or online interactive video games with other people provides the same benefits without sacrificing social interaction.

And, while several studies have shown that workers required to multi-task in the workplace can be less productive than those with fewer tasks to perform, other studies have shown that the ability to multi-task can be highly beneficial. For instance, a recent study by Kleider, Parrott and King of Georgia State University demonstrated that police officers with greater multi-tasking capacity were less likely to shoot an innocent person. “In cognitive psychology, operation span—or working memory—is an overarching cognitive mechanism that indicates the ability to multi-task…People with a higher capacity are able to keep more things ‘in play’ at one time,” said Kleider.

I think it quite apt that Sawyer’s article in the Ottawa Citizen precedes his book tour for “Wake”, the first book of his new series about the World Wide Web.

So, what IS normal?


Recommended Reading:
Basak, C., W.R. Boot, M. Voss, and A. Kramer. 2008. Can training in a real-time strategy videogame attenuate cognitive decline in older adults? Psychology and Aging 23: 765-777.

Boot, W.R., A.F. Kramer, D.J. Simons, M. Fabiani, and G. Gratton. 2008. The effects of video game playing on attention, memory, and executive control. Acta Psychologica 129: 387-398.

De Lisi, R., and J.L. Wolford. 2002. Improving children’s mental rotation accuracy with computer game playing. The Journal of Genetic Psychology 163: 272-282.

Green, C.S., and D. Bavelier. 2003. Action video game modifies visual selective attention. Nature 423: 534-537.

Kleider, H., D. Parrott, and T. King. 2009. Shooting behavior: How working memory and negative emotionality influence police officer shoot decisions. Applied Cognitive Psychology (in press).

Kramer, A.F., and S.L. Willis. 2002. Enhancing the cognitive vitality of older adults. Current Directions in Psychological Science 11: 173-177.





Nina Munteanu is an ecologist and internationally published author of novels, short stories and essays. She coaches writers and teaches writing at George Brown College and the University of Toronto. For more about Nina’s coaching & workshops visit www.ninamunteanu.me. Visit www.ninamunteanu.ca for more about her writing.

15 comments:

Åka said...

What is normal? I think that in order to answer that in any meaningful way you need to see human nature in the context in which it evolved. We adapt to a lot, but I think we need to understand where we come from in order to build society and technology that can help us make the most of our inborn strengths.

Nina Munteanu said...

Good point, Åka. Context is the magic word here. And attitude (borne of a secure place). If we approach change with insecurity or fear, we tend to color its potential with negativity. If we approach it with confidence, we are more positive.
When we choose to define a new trope or idea by its negative properties, we doom it to failure at the outset. This is the hallmark of a closed and fearful mind, born of a lack of self-esteem and insecurity. What results is a lack of faith in humanity's ability to do the right thing and a general disrespect for others' moral decisions. Though I confess to the odd bout of blind optimism, I am not advocating blind acceptance... Rabbits don't hang out with foxes... yet... A path midway -- the path of optimistic realism.

Jean-Luc Picard said...

Normal? There is no such thing. What WE perceive always seems normal...even a psychopath will think his behavior normal.

Nina Munteanu said...

YES! Excellent point, Jean-Luc. Well, then... why be normal?...

Modern Matriarch said...

Oops, my original comment has vanished into cyberspace. Well, anyway, I just wanted to say hi! and tell you that I appreciate your talent. Thanks for sharing your perspective, which makes me feel a little less abnormal.

Nina Munteanu said...

Hey, Tricia! Great to see you here! Thanks for the support. I figure that "normal" is so cliche... and overrated... :)

Anonymous said...

Why be Normal....

When the going gets weird, the weird turn pro!

- Limberger

Nina Munteanu said...

LOL! I love that, Limberger! Coming from an expert such as yourself, I take that as... well... as ... :D

"Let there be weird!"--Nina Munteanu
**************
:D

Heather Dugan ("Footsteps") said...

"Normal" IS the moving, ever evolving target. People either aim for it or flatly target a contrary direction.

On the multi-tasking... Motherhood makes it a necessary skill for many of us. The difficulty lies in finding the elusive "off" switch. It becomes exhausting and less effective over time. I wonder how many others have trouble NOT multi-tasking when the situation doesn't merit multiple focuses.

Baby Brie said...

Just so you know...psychological testing is only a part of diagnostic evaluation for ADHD. A summary of diagnostic evaluation for ADHD might look as follows:
•Physical Exam - Office Visit
•Clinical Interview - Parents (45-60 minutes)
•Clinical Interview - Child (45-60 minutes)
•TOVA test
•Parent and Teacher Rating Scales
•Office visit to review information and develop a treatment plan
•Other testing if there were still questions to be answered

Some clinicians find the TOVA, or Test of Variables of Attention, to be the most helpful for the diagnosis and treatment of ADHD, but always in combination with the interviews, rating scales, physical exam and perhaps other testing.
The TOVA is actually (surprise, surprise) a computer test (supposedly extremely boring) that requires the kids to respond to a target stimulus by pressing a button, or to withhold a response when a non-target stimulus appears. The fact that it is so boring helps to differentiate between kids who have trouble with "boring" and kids who do all right with "boring."
One reason for the rise in the number of children diagnosed with ADD/ADHD might be that in the past children were often misdiagnosed with a learning disability or a depressed IQ when, in fact, they had ADD/ADHD. I would suggest that the rise may be a result of our ability to differentiate and discern 'disabilities' that in the past were all lumped together. Now that we can discern the differences, we can treat the conditions accordingly.
Computers have helped make the statistical analysis of the reliability and validity of psychological tests easier, and as a result we may be a little more comfortable accepting the predictive ability of such tests. The TOVA appears to have been first developed in the 1990s; however, subsequent versions have been developed in response to changing times and further critical analysis.
For 'abnormal', check the DSM, the Diagnostic and Statistical Manual of Mental Disorders. If you don't have the required number of identifiable symptoms for a given disorder, consider yourself normal. :-)

Nina Munteanu said...

You raise an interesting point, Heather... What do multi-taskers do when they aren't required to multi-task?... languish? ... turn into country & western singers?

Thanks for the info, Baby Brie! Yes, I think the rise is partially a definition/perception phenomenon... Not unlike in my field of the environmental sciences, where now that we can detect very low levels of some chemicals... suddenly they are EVERYWHERE!!!

...as for normal... I still think it's very overrated... :D

To seek "normality" is in some ways to be a conformist and be "similar" to the rest... where's the fun in that? Where's the creativity? Originality? Absurdity? Wonder? Excitement?

For me, as an artist, to be "normal" is to negate some of what is inside me that wants to express itself.

Even in science, we must watch what we mean by "normal". The word requires judgment, with many caveats attached. What are we really doing when we "normalize" data, as we so often do in science? What do the data mean then?
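To make that concrete, here is a minimal sketch in Python of one common meaning of "normalizing" data, the z-score transformation; the numbers are made up, purely hypothetical, not from any of the studies above. Notice that after normalizing, a value is "normal" (z near 0) only relative to the sample it came from: the group itself defines the norm.

# Hypothetical illustration: z-score normalization, one common sense
# of "normalizing" data. The raw scores below are made up.
scores = [72.0, 85.0, 90.0, 61.0, 78.0]

mean = sum(scores) / len(scores)
variance = sum((x - mean) ** 2 for x in scores) / len(scores)
std_dev = variance ** 0.5

# Each value is re-expressed relative to the sample mean, in units of
# standard deviation: "normal" is whatever this particular group averages.
z_scores = [(x - mean) / std_dev for x in scores]

for raw, z in zip(scores, z_scores):
    print(f"{raw:5.1f} -> z = {z:+.2f}")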

Nina Munteanu said...

LOL! I just reread what I'd written about being normal... and it made me laugh... Well, why be normal?...

The bottom line is that to gain perspective and make discoveries in anything, you cannot look at it directly; you need to look at it askance, at an angle. The same goes for what generally constitutes "normal" to us.

By its very nature, "normal" within a culture and zeitgeist is the "status quo": the familiar, the common and comfortable, the accepted and, often by implication, the "static" (and often prosaic), the EXPECTED...

muzuzuzus said...

I do not believe in the phony diagnosis of 'ADHD' and see it rather as a wake-up call to all adults that the big drug corporations are now targeting our very children.

I am NOT saying that some children aren't 'difficult'--I am saying that calling it a disease/disorder is like some kind of 'magic wand' that allows people to carry on with the toxic cultural doings as usual, without having to examine them.

I highly recommend this book by neurologist Dr. Fred Baughman.
I am sorry about the long web address, but you can actually read most of it online if you want:
The ADHD Fraud: How Psychiatry Makes "Patients" of Normal Children
http://books.google.co.uk/books?id=3R4XCP1Dwi8C&dq=dr+fred+baughman+the+ADHD+fraud&printsec=frontcover&source=bn&hl=en&ei=GVE2SunMMIXLjAehsaChCg&sa=X&oi=book_result&ct=result&resnum=6#v=onepage&q=dr%20fred%20baughman%20the%20ADHD%20fraud&f=false

muzuzuzus said...

And here are my views on many of the violent video games etc.: http://intothefaerywoods.blogspot.com/2009/10/brainstorming-state-were-in.html

Nina Munteanu said...

Muzuzuzus, thanks for your thoughtful comments and links for my readers...

I can't agree with you that ADHD is a "phony" condition in humans. I am a scientist, and the evidence is overwhelming; I have discussed this condition with my colleagues in the psychology field, and I personally know people diagnosed with it.

I do, however, agree with you that mis-diagnoses of all kinds have been used in some circles to avoid responsibilities, such as giving time to high-energy children and respecting them for what they are--just take a pill and all will be fine. But for whom?

In a previous post of mine, "Cosmetic Neurology--The Cost of Cognition Enhancement", I discuss how students and business people without this physiological condition are using its drugs, like Ritalin, as study aids and cognition enhancers, with hidden costs precisely because they do not have the ADHD physiology... Here's the link: http://sfgirl-thealiennextdoor.blogspot.com/2009/09/cosmetic-neurologythe-cost-of-cognition.html


Humanity will continue to find excuses for abdicating personal responsibility... not unlike the post of yours that I commented on, where I referred to the sad case of a man bleeding in a crowd while no one helped, because each onlooker thought someone else would do it. And, yes, he bled to death.

That's sad.