Going to extremes with visual literacy: good learning principles and (gulp!) video games

One of the best PD opportunities I have had this year was an audience with James Paul Gee, a presenter at the Learning and the Brain Conference held in San Francisco.

He challenged his audience of teachers with the notion that education should be a whole lot more like video gaming. This seemed like a pretty radical perspective coming from an old professor of literacy and linguistics. As I recalled, his earlier work (which I came across repeatedly when I undertook my M.Ed nearly a decade ago) had focused (much more sedately) on language and literacy education. However, as I looked over his bio, I could see the shift in focus reflected in book titles like What Video Games Have to Teach Us About Learning and Literacy (2007, 2nd Edition) and Good Video Games and Good Learning: Collected Essays on Video Games, Learning, and Literacy (2007). Clearly, he had found that language and literacy can be dramatically enhanced through the medium of gaming – a marriage of entertainment, learning, and mastery.

Nevertheless, this was a pretty revolutionary – even confronting – way to start a keynote address to an auditorium of educators. Of course, clever chap that he is, he didn’t just come out with this startling statement. Instead, he worked the audience up to it with some comments of such striking logic that we didn’t even realize where we were heading until it was all so self-evident, it was hard to disagree.

His opening comments were about recent changes in education and the learning sciences. We used to think humans were predictable, logical beings, good at rule-following, and so on. But newer theories describe the human brain as hardly comparable to a calculator and not particularly good at abstraction. Rather, our minds are good at assimilating experience – we learn best from experience.

Now, there is one big problem with this predilection: the world is not a very good teacher; we can get killed by life experience. So, if educators want to harness the power of experience for teaching and learning, then the experiences for learning need to be well designed. We can’t just turn kids loose to learn. Teachers need to purposefully design learning experiences.

What design elements do these learning experiences need to include? Gee provided the following list:

  • Motivation (Emotion)
  • Clear goals
  • Reflection IN action (Did that work? Why or why not?)
  • Immediate and copious feedback
  • Attention (Clarity about what to focus your attention on)
  • Perspective (Inside and outside the experience – what is the big picture as well as the individual experience?)
  • Practice at applying learning to new experiences
  • Debriefing (Reflection ON action) … reflecting and sharing explanations with other people

Enter: Video Games!

Now that Gee had me nodding and agreeing with him, I was a lot more open to his hypothesis that video games are promising for learning because they are an experience that is completely guided and controlled by the creator. In addition, video games can be understood as operating on two levels – the software is basically a well-designed experience, and the meta-game is the well-designed social interaction that takes place around the game. (People play video games, like World of Warcraft, and then get involved in a community to discuss and reflect on the game together.)

Video games = well designed experiences + well designed social interactions

Good teaching = well designed experiences + well designed social interactions

But Gee wasn’t content with having made this connection. He switched focus at this point to language and literacy (his old stomping ground), noting that school is designed around language – specialist, academic language. However, when kids fail with language, he asserted, they don’t usually fail at phonetic reading. Actually, they fail because they are bored by it.

Kids learn a language by having experiences with that language. You can’t learn language through more language alone, though. Right?

To illustrate this, imagine yourself coming home from the store with a brand new video game. (Or better yet, imagine a kid.) What happens next? Do you – or that child – rip open the wrapping and then sit down on the couch with a copy of the instructions to pore over?

If you’ve ever tried to read the instructions for a sophisticated video game, you’ll know that they are virtually indecipherable. You have to actually play the game to get what’s going on. Then, after experiencing the game, you have a much greater chance of making sense of those instructions. When we can fill in images, actions, experiences, and dialogue, we have a “situated meaning” for a piece of language. So now it’s not hard anymore; it’s obvious!


Can teaching be like this? If we were to give our students the game experience equivalent of the biology concepts we want them to learn – instead of giving them the biology textbooks to read – Gee asserts that we wouldn’t be getting that bell curve anymore. And we could happily screw up the national testing system!

 


Capitalist companies in the video gaming industry have figured out the incredible power of this “situated meaning.” Take Yu-Gi-Oh!, for example.

Yu-Gi-Oh! is a Japanese manga franchise that includes a trading card game and numerous video games. Most versions of Yu-Gi-Oh! involve a fictional trading card game, where each player uses cards to “duel” the other in a battle of fantasy “monsters.” From an educator’s perspective, what’s interesting about all this is just how complex the language of these cards can be … and just how addictive the game is, nevertheless. We face significant difficulty in motivating our students in schools to take on long and challenging tasks; yet, many video games are long and hard … and kids love them!

It is this intense engagement in self-directed learning that Gee wants educators to harness. Once again, he highlighted the features that video games boast that make them ideal for learning.

  • Motivation: Oodles of it
  • Problem-solving: Research shows that if you teach facts and test the way we do today, you will get facts and formulas learned, but this doesn’t correlate with problem-solving; if you instead teach through problems, the learner has to recruit the facts to solve the problem, so you get both the facts AND the problem-solving
  • Clear goals: e.g. to reach the next level
  • Copious feedback: Running scores and updates
  • Well-designed experience: Completely constructed – no real-life randomness!
  • Mentoring IN the game and in the META game (and the transfer from video game experience to real-world experience comes through the community interaction)
  • Performance before competence: Educators don’t tell students: Keep your mouth shut until you learn English … and yet we do often skip straight to the expectation of mastery, without the benefit of practice and experience, when we try to teach other concepts
  • Failure: Failure-based learning works in video games because the cost of failure is kept low – if the cost of failure is high, you won’t continue, but a low cost frees you to explore everything thoroughly, rethink your goals from time to time, try out new styles, and take risks
  • Well-ordered expertise: Learners continuously face problems they need to solve; the problems are scaffolded and ordered so as to keep players/learners right at their Zone of Proximal Development
  • Cycle of expertise: Give the player/learner a problem (challenging, but doable); they practice it until they can do it in their sleep; then you provide the next problem, and the automatic expertise has to be undone and challenged again

Gee made another rather “A-ha!” connection for me between education and video games – or rather, between assessment and video games. Video games make you realize how unnecessary separate assessment is to a well-designed learning experience (i.e. the video game), because they:

  • Integrate learning and assessment
  • Provide copious information
  • Include multiple variables
  • Track growth and trajectories across time
  • Provide preparation for future learning
  • Are formative and evaluative at the same time (formative = evaluative)

Think about how you can’t get out of a video game level until you have mastered that level – and then think about how ridiculous it would be to tell the gamer who has finally completed the last level: “Okay, you managed to master that level … but now you need to take a test on it!”

Despite my first impression that his views had grown quite extreme, Gee, in his article elaborating on the ideas from his presentation, doesn’t consider his stance either conservative or liberal, traditional or revolutionary. He says:

The progressives are right in that situated embodied experience is crucial. The traditionalists are right that learners cannot be left to their own devices; they need smart tools and, most importantly, they need good designers who guide and scaffold their learning. For games, these designers are brilliant game designers like Warren Spector [best known for the cyberpunk video games System Shock and Deus Ex] and Will Wright [original designer of The Sims game series which, as of 2009, was the best-selling PC game in history]. For schools, these designers are teachers.

So, like a good video game designer, teachers like me need to see ourselves as practical theoreticians of learning. The profit motive has prompted people in the interactive video gaming industry to produce incredibly challenging, incredibly motivating learning experiences. What will motivate teachers to do the same? Or policy-makers and governments to measure the value and cover the cost?

The Costs of Constant Connection

A Newsweek article from August this year – “Lost in Electronica, The costs of ‘the chaos of constant connection’” – gave me pause. It asks us to re-think boredom – usually considered a bad thing – as a special privilege of the complex brain, and thus an aspect that distinguishes us from our animal friends. George Will posits that our capacity for boredom is essential to our humanity because it means we have the mental ability, and the space, to reflect and plan, space for empathetic thinking and community-action. A vital ingredient of global citizenship. And one that is increasingly absent in our constantly connected, constantly stimulated society.

Taking a cue from the evolutionist argument, Will suggests that our efforts to skirt boredom can be explained as the response of a brain formed in dangerous prehistoric times and wired to be constantly alert and vigilant. Our modern lifestyle has brought with it an unprecedented climate of safety and, thus, sedentary brain function, but our minds still seek stimulation and find respite from boredom in audio-visual entertainments. New technologies now allow young people’s brains to remain in an almost constant state of being “switched on” so that boredom can be utterly assuaged.

Will is actually reflecting on an article, “The case for boredom,” by clinical psychologist Adam J. Cox, author of Boys of Few Words: Raising Our Sons to Communicate and Connect (Guilford, 2006) and No Mind Left Behind (Perigee, 2007). Cox describes the minds of today’s teenagers as made obese by the sudden abundance of electronic stimuli. Just as human beings have gorged on the far greater quantities of salt, sugar, and fat that the modern diet allows in contrast to our genome’s formative times, our modern minds now crave the “junk nourishment” of the “ubiquitous barrage of battery-powered stimuli delivered by phones, computers, and games,” which together form an “addictive electronic narcotic.”

As with all things electronic, the last half century has seen an exponential increase in this phenomenon. Previously, boredom might have troubled the brain only after an hour or two of nothing much to do; these days, kids feel bored far faster. As constant stimulation and amusement become the “new normal,” boredom is disappearing, and with it, by definition, the “available resources for thought, reflection, and civil behavior.” With “excess amusement,” young boys (for they were the focus of Cox’s clinical work) are induced into a “pleasant trance from which they do not care to be awakened” and from which they “fail to launch” from self-centered adolescence into the adult world. Cox reminds us that being a responsible and contributing citizen is rarely fun – “it requires patience, forethought, and some willingness to tolerate tedium.” Thus, less boredom means less opportunity to practice civility and civic-mindedness.

The ominous corollary is that our constantly connected young people are stuck in an “electronic playground” of hyperstimulation. In fact, with the diagnoses of learning and attention deficit disorders going through the roof, Cox argues that at some point we just have to drop the label “disorder” and call it, simply, the way we really are right now. And the way we are is amused into a sort of self-absorbed oblivion.

Concerns about how our society is “dumbing down” are not new. Bradbury, of course, was on to it in the 1950s. Neil Postman was still on about it in the 1980s (in his book Amusing Ourselves to Death), arguing that politics, religion – rational argument and the quality of information – had all been diluted and made subservient to entertainment. He writes that consumers have basically, and voluntarily, surrendered their rights in exchange for entertainment.

While Postman pointed the finger at television, Cox points out that the available electronic stimuli have multiplied, with dire consequences for the next generation: “Unlike reading and listening to stories, the blitz of electronica doesn’t build deeper listening skills or a greater range of emotional expression.” He says, “Not only does withdrawal into electronica enable them to bypass the confusion and pain of trying to give their emotions some coherence, it also helps them avoid the realities of being a flawed, vulnerable, ordinary human being.”

Will adds further concern to this social phenomenon of stunted maturity with findings from the field of neuroscience. Brain scientists have shown that the “mature” brain is not a finished product and can, in fact, be rewired by intense and prolonged experiences. “Some research suggests that the constant short-term stimulation of flitting to and fro among digital promptings can impede long-term memory on which important forms of intelligence depend.”

Thus, Will expands Cox’s concerns to embrace us all. He argues that it is not just young boys and not even just the next generation who have all too successfully beaten away boredom. “Adults of both sexes, too, seem insatiably hungry for handheld devices that deliver limitless distractions.”

“We are in the midst of a sudden and vast social experiment involving myriad new means of keeping boredom at bay. And we may yet rue the day we surrendered to the insistent urge to do so.”

So, what do we make of Cox and Will, Postman and Bradbury? Are these just old guys on the un-cool side of the generation gap? Or have they got a (frightfully) good handle on the nature of the next generation?

Perhaps we could simply put it all down to that digital divide between natives and immigrants (to use Marc Prensky’s terminology to distinguish between people born before and after the advent of the computer). Digital immigrants stand out for their predigital dispositions; Prensky calls these digital “accents” in a blog for Edutopia (“Adopt and Adapt: Shaping Tech for the Classroom”). Having come to technology later in life, they tend to downplay the importance of new tools, concepts, and practices made possible in the new digital world, such as the importance of online relationships as compared to face-to-face ones. He states unequivocally, “Such outmoded perspectives are serious barriers to our students’ 21st-century progress.” Prensky positions himself squarely on the side of the natives in his 2006 book, Don’t Bother Me Mom–I’m Learning!, which champions the educational benefits of video game play. He argues that children “are almost certainly learning more positive, useful things for their future from their video and computer games than they learn in school!”

Educationally speaking, therefore, should I be glitz-ing up my teaching with technology? Should I be seeking to engage my students through entertainment? Video games and social networking sites – are these really my best allies in teaching and learning? Should I be making such efforts to go to where the kids are – “hanging out, messing around, geeking out” online – that education looks a whole lot like what kids are already doing with technology? So students don’t even notice when I am slipping in a bit of teaching on the side?

Or is it okay if learning is a bit boring sometimes?

“To be able to delay immediate satisfaction for the sake of future consequences has long been considered an essential achievement of human development.” So sayeth Shoda, Mischel, and Peake in reporting their famous (1990) experiment on delayed gratification in the American Psychological Association’s journal Developmental Psychology (“Predicting Adolescent Cognitive and Self-Regulatory Competencies From Preschool Delay of Gratification: Identifying Diagnostic Conditions”). So, should I put that marshmallow (or Oreo cookie, in some replications of this experiment) on the desk in front of them? Or is a little bit of delayed gratification (and hard slog) a good thing for my students?

Certainly there seem to be some serious differences of opinion on either side of the digital divide when it comes to information obtained over the internet and concepts of originality – intellectual property and copyright. Some say that many digital age students simply do not understand that using words they did not write is a serious offense. If these digital natives have an “accent” then it is decidedly colloquial. Lazy, even.

On the one side, there is the perspective that the vast buffet of online information is open to the global community and therefore counts, basically, as common knowledge. Cutting and pasting has canceled out the concept of authorship. Digital natives – who have grown up with file-sharing, web-linking, and Wikipedia – assume that the information “out there” is available for anyone to take. This puts them profoundly at odds with educators and older adults who bluntly call this plagiarism.

A New York Times article (“Plagiarism Lines Blur for Students in Digital Age,” August 2010) claims that the number of university students who believe that copying from the Web constitutes “serious cheating” is declining. The Western concept of intellectual property rights and singular authorship seems to be on the way out. It’s been with us for a while, anyway – since the Enlightenment, one could argue. So is this simply a natural waxing and waning of an ideal? Or is it the typical lack of discipline we tend to find in younger generations? Inevitable paradigm shift? Or “Generation Plagiarism” impatient for a fast-food, corner-cutting solution to the problem that writing is difficult and good writing takes time and practice? One university official in the article reported that a majority of plagiarism cases at his university involved students who knew perfectly well that they needed to credit the writing of others. They copied intentionally, he said, knowing it was wrong. They were just unwilling to apply themselves to the writing process.

Or did they just recognize that it was wrong in the eyes of the old dudes in charge? In the same article, it was argued that notions of originality and authenticity have changed with the popularity of the “mash-up.” Student writing now tends to mimic the sampling, combining, and synthesizing that is more and more evident in music, TV shows, and YouTube videos.

Students themselves are less interested in creating a unique identity than young people were in the 1960s. Instead, they try on different personas like they try on different outfits, enabled by social networking and file-sharing technology. Borrowing freely from whatever’s out there, this might just be a new model young person: “If you are not so worried about presenting yourself as absolutely unique, then it’s O.K. if you say other people’s words, it’s O.K. if you say things you don’t believe, it’s O.K. if you write papers you couldn’t care less about because they accomplish the task, which is turning something in and getting a grade. And it’s O.K. if you put words out there without getting any credit.”

Of course, this kind of “us and them, either/or” talk is dramatic and thought-provoking, but it doesn’t really reflect what I’m thinking. Indeed, when it comes to technology in education, the polarized perspective is neither reasonable nor practical. The genie is already out of the bottle – the laptops are already in their hands at TAS. In fact, the students in my classroom have never known a pre-MySpace/Napster era.

Now, for me, it’s about tapping into the potential of electronica to enhance education through its powers of motivation, customization, and collaboration – AND balancing this with some old-fashioned discipline of the mental, social, and physical variety. Why? Because the gatekeepers are still the old dudes. The college admissions officers still care (somewhat) about school grades and SAT scores. But the revolution is coming, and the students will be leading it. So, as Jeff Utecht commented during his TEDx Talk, I need to be socially networked, personal-learning-network-connected, or just frequenting Facebook, to see it coming!

Technology & Twisted Fiction: Does Bradbury’s twisted future have anything left to teach us?

I’m looking forward to helping Ray Bradbury exorcise his dystopian demons again this year. I have been studying his Fahrenheit 451 with my students for the past six years, and invariably he leaves them worrying and wondering, at least a little, about the state of our society.

This is the unit of work that our group of three Grade 8 English/Social Studies teachers has identified for enriching with technology to meet our Course 1 goals, and our teaching goals too, of course.

It amazes me, actually, that students engage at all with Bradbury’s metaphor-manic prose. As I explain to the students, whose eyes have just about rolled to the back of their heads by the end of page 2, this novel is really the author’s soapbox stand. It’s an urgent message he wants to get out to all of us about censorship and conformity, about misguided attempts to find happiness, about loss of humanity, about being entertained into mental oblivion … And about the impact of technology on society. He could have gone into politics, or the music industry, or taken to graffitiing important monuments with his message. But instead he wrote a book. I am pretty blunt – I don’t promise a compelling plot or rich characterization (after all, they have just been indulged with John Steinbeck’s Of Mice and Men). I tell them they will have to read Fahrenheit 451 as detectives. They need to wade through the clues of Bradbury’s literary puzzle, and take in the sprinkling of storyline on the side, to unearth the rich themes which are strikingly relevant today for a book that was written in the early 1950s about a future imagined in the 24th century.

For much of the story, we see technology perverted to the whims of a faceless, nameless, all-powerful government/media monopoly that seeks to maintain control through a devious mix of censorship and entertainment. These “authorities” have taken advantage of the citizens’ increasing laziness about wanting to know what’s going on in the world, about wanting to truly learn about life, warts and all. Instead, people are increasingly satisfied to be passive, unquestioning recipients of a wave of empty factoids beamed steadily at them through the interactive big-screen TVs that have substituted for walls in their homes, and through the radio earbuds (iPod precursors?!) they wear constantly. These plug up their ears to the point that citizens have learned to lip-read, so as to avoid dividing their attention between the gripping infotainment streamed directly to their ears and the people they live and work amongst (but with whom they don’t really engage). Not only has technology enabled mental laziness, it also facilitates physical laziness by taking over basic daily functions, such as passing the breakfast toast from toaster to plate. It also comes into play when it’s time to pick up the pieces of the citizens’ sad, empty lives. Drug overdoses are so frequent that doctors no longer attend this kind of emergency; instead, a machine has been invented to do the stomach-pumping and blood transfusions, and it requires only a machine operator to plug it in and turn it on. Citizens are both willingly and unwittingly giving up their autonomy to technology.

Bradbury’s futuristic society seems to be a sad and sorry mix of George Orwell’s 1984, in which the citizens are oppressed by their government, and Aldous Huxley’s Brave New World, in which citizens are oppressed through their addiction to amusement. It is all very oppressive!

Fortunately, the last stages of the book reveal an alternative future. One rebel has developed a two-way radio earbud so he can stay in contact with his new friend, the protagonist, and so together they can fight the forces of evil. And finally, at the end of the novel, our hero meets a group of outcasts who seem to have technology firmly in hand – quite literally, in the case of the televisions, which have been downsized into hand-held devices, handy for news updates but also easy to put down and walk away from.

This leads me to wonder how easy it is for us to walk away from technology these days. It is a question I put to our students, and we have, for a number of years now, conducted our own social experiment with a voluntary technology fast – or, rather, an e-media fast.

Every now and then, I read of other teachers and students pondering the same question. Just this month a high school in Portland, Oregon, attempted their own “fast” to prompt the same kind of self-reflection: “Students at Portland’s Lincoln High School unplug, experience life without technology”. Ironically, by switching off, kids are prompted to switch on their consciousness: “You don’t have to go live in the woods, but you have to be conscious about how you use these electronics,” one student reported in the article. “When people are educated about what they’re doing, that’s when they can make a personal decision to change.”

My own students have recounted widely varying personal responses. (We have tried to do it over a holiday period, because some level of laptop use would be required on a typical school day.) Some highly active kids just get on with their physical activity – no big deal. Highly social kids organize get-togethers in real time and space, instead of cyberspace – they plan sleepovers, bake cookies, and hang out together. For some, it takes organization and effort to stave off the boredom and disconnection they feel without technology. For others, it is surprisingly tough and they quit early. A small group invariably reports that they know they couldn’t get on with life without technology, and so what? They don’t have to contemplate a life without technology, anyway, so what’s the point? They are self-confessed addicts, but the addiction seems pretty harmless. Or is it?