A Glimpse into the Future

I did the simplest thing in class yesterday. It was genius – by which I mean that I had just experienced one of those “A-ha!” moments when I saw with great clarity exactly how a tech tool could enhance my teaching…with ease!

I wanted to introduce our next novel study to the class: Fahrenheit 451 by Ray Bradbury. We read this novel primarily to explore its themes and what the author wants to teach us through those themes about the state of our society. As both a Grade 8 English and Social Studies teacher, I wanted to make a connection to our current study of Ancient Greece. Socrates was the obvious dot to which I wanted to connect Bradbury because Socrates advocated critical self-reflection and the rigorous questioning of taken-for-granted assumptions. Two of his most famous quotes indicate the goals of his philosophic introspection: “Know thyself” and “The unexamined life is not worth living.”

As serendipitous as it was that I was concurrently introducing Socrates and Bradbury, I did not want to take up a lot of class time with this point. So, how could I make the connection brief, yet engage students in the exercise (rather than just standing up front and telling them point blank, only to have many of them miss it or instantly forget it)?

At this juncture, my mind alighted upon one of Tom Daccord’s little gems, as shared with Social Studies teachers at a recent middle/high school workshop. Fusing this with the “million dollar job” activity from Jeff Utecht’s August 28 Course 1 intro day, I had the makings of a cool little exercise.

First, I told students to use whatever class time remained once they had finished and submitted their Ancient Greece section quiz to find out more about Socrates. I encouraged them to use any online reference tool they liked for a 5-minute skim-read, with the purpose of identifying Socrates’ main philosophy. Then students were to summarize his main teaching point in a single sentence – it could be a quote from the reference site, or one of Socrates’ own, or a sentence composed by the student. While they were at this, I took the next step:

By filling in the blue form boxes at TodaysMeet – a matter of a few seconds – I had created almost instantly a simple space on the web for a synchronous conversation at www.todaysmeet.com/socrates. I projected the webpage with this simple URL, and students joined me, also in a matter of moments.

I told students to cut and paste their sentence into this chat. Once individuals had finished this step, I told them to watch as more sentences were added, and review those already submitted, with the goal of looking for trends – what ideas came up most? In this way, every student was engaged with the task, and every student reported to the class. And it all took at most 10 minutes.

Finally, I scrolled through the transcript of the chat and elicited from students the words and phrases they could see were repeated most often. The rationale was that any one individual might have misunderstood what she/he had read, or been simply a bit off the mark in describing Socrates’ critical ideas, but that the classroom crowd as a whole would probably have identified the key ideas correctly, so we could boil the chat input down to those cross-referenced points.

Sure enough, key words like “question” were repeated, and one or both of the quotes I had hoped they would stumble across stood out from the list of sentences. In a matter of collaborative moments, I believe the students had gained a clearer picture of what Socrates stood for through a process of social meaning-making enabled by a Web 2.0 tool. And, also, by me. Connectivism on multiple levels.

With the connection made, and an essential understanding for the novel study deeply etched along my students’ neural pathways, they could all turn their attention to reading and self-reflection with Bradbury. It had all been so very simple, student-centered, self-differentiating, publicly accountable, high interest, and easily replicable.

Erik J and Chris F will be sharing details in their blogs about the joint Course 1 Project that we undertook to enhance our Grade 8 English curriculum with GoogleSite-based discussion and file-sharing, plus a VoiceThread activity thrown in. Putting this project together took quite some consultation, but we are satisfied that the tech tools we’ve added will certainly enhance our Literature Circle discussions of Fahrenheit 451. I’m sure our efforts will reap rewards for the students in terms of personalizing and internalizing the author’s thematic lessons through collaboration.

But my favorite result of this process so far was that light-bulb moment for me in the classroom, when I saw that it doesn’t all have to be blood, sweat and tears. Technology tools offer instant gratification for old digital immigrants like me too!


Where Technology, Society, and Personal Lives Intersect …

…And what I might be doing as the teacher at that point of intersection

Personalizing and globalizing… Upon reflection, I realize that this same pattern of impact – both helping me learn and develop personally and at the same time connecting me globally – is obvious in my own life. The Web is integral to my everyday life and I know this most acutely when I am sitting down somewhere without it – stuck on the high speed rail to Taichung on the weekend without a consistent internet connection, or sitting at Starbucks in Tien Mou when my “Wi-fi” card has run out.

After all, where would I be as an Australian expat in Taiwan without email, Skype, Flickr, and Facebook? Even my six-year-old daughter is emailing her friends. She has joined an online community of MoshiMonsters gamers (prompted by her Australia-based cousin), and she uses an Australian-authored interactive reading site called Readingeggs.com to develop reading skills, enter writing competitions, and vote for the winners (initially at my suggestion, of course, but now under her own steam). These days she is pestering me to please set up her blog (not “a” blog, but “her” blog!) so she can share her photos and commentary with her friends. Doesn’t sound too sinister or socially isolating, does it?! Quite the opposite, in fact!

Interestingly, I have noticed that my own online practices see me frequently in touch not only with friends and family from afar, but even more often, with those people that I see every day in Taipei – colleagues, friends, and even the students who I see face-to-face for 95 minutes a day! I have to ask myself how different even my teacher-student relations would be if not for email and Gmail Chat, how communication and learning have changed with the new forums for exchanging ideas synchronously and a-synchronously online.

According to Danah Boyd, Social Media Researcher at Microsoft Research New England and a Fellow at Harvard University’s Berkman Center for Internet and Society: “We’re addicted to our friends, not our computers.” Put this way, the use of technology certainly doesn’t sound particularly socially apocalyptic. (She also co-authored the MacArthur Foundation’s book Hanging Out, Messing Around, Geeking Out: Living and Learning with New Media with the Digital Youth Team.)

In fact, a very interesting TEDTalk by Stefana Broadbent recently highlighted how social media and social networking tools are actually reuniting the public and private spheres in a way not seen since before the Industrial Revolution. Back then, the world of work moved away from cottage industry; people left their homes and villages to go to work, and to be schooled to be ready to go to work, and thus the fabric of relationships amongst family and friends was seriously frayed.

[Embedded video: Stefana Broadbent’s TEDTalk]

Broadbent is described as “one of a new class of ethnographers who study the way our social habits and relationships function and mutate in the digital age” and her research shows that the brand-new tools at our disposal are not spoiling but rather cementing human intimacy – even across old barriers such as distance and workplace rules. For example, she says that many people now spend more time writing to their friends than talking to them.

Trawling through the comments on her TEDTalk was also an interesting process, because certainly not all viewers felt persuaded (unlike the glowingly supportive comments on Sir Ken Robinson’s TEDTalks, for example. Is that because he focused not primarily on technology and society, but on education and society – a more well-worn topic, familiar to every viewer at least from their own schooling experience, and therefore with less room for the lack of understanding which is so often followed by fear of the unknown?)

Some respondents were uncomfortable with her sociological standpoint which sounded to them more like intuition and anecdote than real scientific research. One pointed out that Broadbent does not explain how it was that so many people were perfectly content in pre-digital times with not being able to communicate with friends or family during working hours. Now, conversely, so many of us are anxious because of the pressure of constant contact and the expectation to “have something to input in response to the alarm.” New tech tools have allowed us to do more, so now more is exactly what is expected.

Still others insisted that we might be communicating more, but on a more shallow level. “We are becoming a society of page skimmers” and, again, that issue of attention deficits growing was pointed out. Meanwhile, basic standards of communication are dropping, it was claimed: “I’m embarrassed for my peers, who struggle to form complete sentences, let alone spell them correctly. I pity my teachers, who are great at what they do and present this mind blowing lesson to a bunch of young adults who are too busy texting, ‘wat u think, shud i c him again 2nite?’ Let’s not forget that many of these people are texting while they drive, and that’s as bad if not worse than driving drunk. There’s a time and a place, and we need boundaries.”

The last point rings true for me too – the idea that we need some boundaries. That parents need to make sure their children are getting enough sleep. That our governments should continue to prosecute drivers who break the law by phoning while driving. In many cases, these boundaries already exist.

And, regarding the issue of basic writing skills – I feel compelled to share some interesting research on that. In Getting It Right: Fresh Approaches to Teaching Grammar, Usage, and Correctness (Theory and Practice) by Michael W. Smith and Jeffrey Wilhelm, the authors cited findings that challenge the common perception that writing standards really are in crisis due to texting, IM-ing, and the lack of deep reading apparently going on around us in the digital world. They discovered that students today are writing longer, more complex work for their college courses (more than twice as long, on average, as essays written in 1986, and more research-based essays than the previously more popular personal narratives) without a significant increase in the rate of grammar errors. (We need to remember, too, that the mass media thrives on bad news, and the message that schools and learning are in “crisis” has been an easy newspaper selling point for DECADES – so, if there really is a crisis, it’s been around for a LONG time!)

It will be interesting to look back on the impact of blogging on informative writing in general – journalism and news writing, of course, but also how this bleeds into formal academic writing. Surely we must already be seeing in student writing, as we do in serious but nevertheless mass media publications such as Time and Newsweek, an increasing trend to be personal and informal instead of scholarly in the strictly third-person sense. I imagine that the criticisms about declining standards in writing skills will continue – because more informal and personal means more conversational, which means less sophistication in word choice, right? Or does it? In fact, I feel I am seeing more frequent recourse to analogy, more reinvention of cliché, and certainly a lot more innovation in word choice these days, as necessitated by the fact that we are often writing about things that didn’t exist 20, 10, 5, or 2 years ago. To write about the unfamiliar, people need to find ways to refer to the familiar, so there’s a lot of figurative language – which is sophisticated language in my book – used for comparison in the service of illumination.

But, I digress…  (Is writing organization going out the window too?)

Nevertheless, one further comment that resonated with me bemoaned that intimacy is lost, not gained, through close contact: “Having a limited amount of something makes it valuable. Having constant access makes each interaction less valuable …  if you never get to MISS the other person because these things keep you in such close contact.. then the value is reduced. …Thoughtfulness is a wonderful thing, but without moderation it isn’t thoughtful and deliberate, it is just convenience.”

Even so, convenience is changing our world, and enriching it, even if some of this comes at a price. Wouldn’t it be naïve to expect things not to change, or that something won’t be lost in the replacement of new ways for old?

When it comes to education, there are still so many innovations in technology, and social media in particular, to talk about and get excited about, even if we are not quite sure what this means for the traditional role of the classroom teacher. For example, Chris Anderson’s TEDTalk on Crowd-Accelerated Innovation takes a phenomenon that we are all watching and tells us some surprising things about what this means. Anderson has been in pole position to watch this going on because he is the curator of TED.

[Embedded video: Chris Anderson’s TEDTalk on Crowd-Accelerated Innovation]

Basically, he observes that the sharing of performance and achievement online – he cites Web-taught dancers, and even TEDTalk presenters – has resulted in better and higher performance and achievement. It’s almost as if, with each recorded effort, the presenter finishes with the same challenge for the next person: “Step your game up!” So viewers-turned-participators have done just that. People watching Web video have been drawn into a self-generating cycle of improvement. Anderson has termed this “crowd-accelerated innovation.” Three requirements for this phenomenon to occur are:

  1. The crowd: a group that shares a common interest. Amongst this crowd will be “commenters, trend-spotters, cheerleaders, and mavericks,” but the larger the crowd, the more innovators it will include.
  2. The light: access to view the efforts of the best of the group, so other group members are able to learn and better themselves.
  3. The desire: because innovation is hard work.

With these three ingredients, crowd-accelerated innovation can happen on a street corner, as it does in the case of street dancers, for instance. However, the internet has expanded all three of these elements exponentially. The crowd viewing the performance is now global through online video. And the crowd shines the light on the best of the best through comments, ratings, Facebook, Twitter, links on Google, etc. PLUS the desire factor is ratcheted right up because any kid with a webcam can be viewed by vast audiences. Woot!

This kind of global recognition and opportunity is driving huge amounts of effort, says Anderson, and because everyone can see it, the benefits of that effort, the learning, is shared with everyone. And the light and desire combine to attract yet more people to the crowd, and so the cycle swells.

Anderson is most excited about the potential of this model to inspire radical openness – amongst companies and institutions of all sorts. Because to attract the light, you need to open up: by giving away your deepest secret, you empower millions of people to improve it. He points out that this is not revolutionary thinking – even Isaac Newton knew full well that he stood on the shoulders of others, and that innovation is usually a collaborative effort.

The thing that is really new in all this is web video. For the first years of the internet, video files were prohibitively large for the infrastructure of the web. But now this has changed. Incredibly, “humanity watches 80 million hours of YouTube every day. Cisco estimates that within 4 years nearly 90% of the web’s data will be video.” This is because video is often superior for communicating information and ideas.

Anderson claims that the “video-driven evolution of skills” from emulation to innovation will lead someday soon to dramatically accelerated scientific advancement as scientists from around the world can push past the limits of words on paper and see for themselves how to replicate experiments. Five hundred years ago the printing press allowed innovators and educators to spread their ideas far and wide, but now we are experiencing another tectonic shift in communication. (Have I caught the wave too close to shore – is blogging already passé? Will reading and writing fade away with a resurgence of the oral tradition transposed to cyberspace? At least, teachers right now have every reason to be promoting public speaking skills!)

The results for education are surely going to be interesting, and far-reaching. The rotten teachers from the When I become a teacher YouTube clip won’t need to be the reason a child is held back anymore. That child can access a far superior teacher online. That teacher can be anyone with access to a webcam. That teacher can be anyone who resonates with the child, anyone with a teaching style that matches the student’s learning style. That teacher can teach any subject matter of interest to the learner. That teacher can be anywhere in the world. That teacher can teach on the student’s own timetable. And with this capability to self-manage education, the student will almost certainly do so. And with the capacity to innovate and communicate virally, the learner can become the teacher. Thus social media will facilitate “the biggest learning cycle in human history.”

“Welcome to the Collaboration Age” says Will Richardson on Edutopia in reference to the Web-enabled “transformative connecting technologies” which have drawn one billion people online with the potential to draw them all together in shared experiences and opportunities to do good in the world. What unprecedented potential, if we can only figure out where we fit into it as educators…

Richardson conveniently summarizes in three questions most of the challenges of Collaboration-Age technology I have recently blogged about:

  • “How do we manage our digital footprints, or our identities, in a world where we are a Google search away from both partners and predators?
  • What are the ethics of co-creation when the nuances of copyright and intellectual property become grayer each day?
  • When connecting and publishing are so easy, and so much of what we see is amateurish and inane, how do we ensure that what we create with others is of high quality?”

With all the question marks swirling in my mind at this point, it was with great relief that I encountered Sugata Mitra on TED. Like Anderson, Mitra communicates the same possibilities for a learning revolution through his Hole in the Wall experiments.

[Embedded video: Sugata Mitra’s TEDTalk on the Hole in the Wall experiments]

Mitra shows that where children have interests, learning will happen – even in places where good teachers won’t go (and every country on earth would have to admit to having some of these places). That’s because the learning will happen anyway – “kids will learn to do what they want to learn to do.” Not revolutionary stuff really, except for the experiment he used to test this hypothesis. Mitra stuck computers, literally, into the walls of slums … and then washed his hands of them. He did this in a variety of God-forsaken places, and after several months, he would return to discover the same phenomenon over and over: that the computers had enabled incredible instances of learning … entirely without teachers.

Of course, “entirely without teachers” is a rather spurious way to put it. It would seem to suggest that the learning was only possible because of technology. But that would be to miss the key components: computers were put into the hands of the students AND the students determined what it was they wanted to learn. AND the learning took place “in the plural.” It was incredibly social. Shared meaning-making. The students worked collaboratively and collectively to deepen their personal learning about an area of genuine interest to them.

In subsequent experiments, Mitra added another element to see if the learning could be enhanced further. And, indeed, it was. He added an individual to the equation, someone who would “use the method of the grandmother” which is to “stand behind them and admire them all the time.” He gave some concrete suggestions (and I wrote them down straight away to use in my next class and all future classes I have the good fortune to teach): “That’s cool. That’s fantastic. What is that? Can you do that again? Can you show me some more?”

…And this is where I let out a deep breath, because finally I had seen a vision of the future of education with a place for me in it! Finally, a picture of what my classroom could look like as a more technology-enriched environment. It reassured me that technology doesn’t have to do away with me and my job altogether. The learning is still better for it happening in my classroom – where students can come together with a similar interest (and at my school, the Holy Grail is still acceptance at a top US university), access technology they may not be able to access at home, and learn both collaboratively and face to face, yet not be inhibited by time or geography.

A classroom-based community as one point of connection for the billions of potential points of connection in a globalized, personalized online education system. And me doing something I feel I can do really well, which is personally connecting with and encouraging kids.

In some ways, it’s a simple, beautiful vision. I wonder if I’m getting too carried away by the romance of it (but, remember, this quality will serve me well in my grandmother/teacher role of the future)! Maybe it leaves the teacher as simply a connector of dots.

Still, I think I’m going to hold onto these images of kids crowding excitedly around computers, in groups, with a cheerleader whipping up enthusiasm over their shoulders, as the ultimate Course 1 take-away.

A Re-education for Would-be Revolutionaries

Another group funneling money into technology for the purpose of enhancing education is the Bill & Melinda Gates Foundation – a $20 million injection of grant money, in fact. The Next Generation Learning Challenges initiative, which supports the work of the nonprofit EDUCAUSE, is focused on improving college readiness and completion through expanding the use of IT tools. In an October 11 eSchoolNews article, Bill Gates explained the impetus this way: “American education has been the best in the world, but we’re falling below our own high standards of excellence for high school and college attainment. We’re living in a tremendous age of innovation. We should harness new technologies and innovation to help all students get the education they need to succeed.”

All of this seems very laudable, but I do wonder how much change and improvement we will see in education while we focus on the tools rather than the overarching paradigm. As Sir Ken Robinson has argued so compellingly in his TEDTalks (Ken Robinson says schools kill creativity, posted June 2006; Sir Ken Robinson: Bring on the learning revolution!, posted May 2010; and the blog post TED and Reddit asked Sir Ken Robinson anything — and he answered, posted August 2009), the educational paradigm under which we are all operating, and into which we are, in fits and starts, trying to fit these IT tools, is seriously misguided. The whole system needs overhauling to meet the demands of our modern, post-industrial society. The entire philosophy just doesn’t make sense anymore.

To start with – and there are many ways our current education system lets students down, to the detriment of society as a whole, so there are many places we could start this argument – one obvious sign that our institutions are failing us is the high number of people with degrees who can’t earn a living because they can’t find a job. According to an article in the New York Times, even “Doctoral Candidates Anticipate Hard Times” in our faltering economy. For so long we have operated under the assumption that getting a college degree is the ultimate goal of going to school – that a degree will ensure employment and financial security. And yet students who are achieving the ultimate accolade – the PhD – still can’t get a job. Robinson, in the Q&A blog post, calls this the “biggest fault line” in our education systems right now. When the “basic currency of education has defaulted,” we’ve just got to think again about what we’re doing it all for.

And Sir Ken is happy to help us with this. He calls for a radical rethink of school in order to cultivate creativity – the most critical commodity in these 21st century times.

[Embedded video: Sir Ken Robinson’s TEDTalk, Ken Robinson says schools kill creativity]

For our own personal fulfillment and for the betterment of our global community, we need to stop “educating people out of their creativity” and start inspiring it. This also requires that we recognize and nurture multiple types of intelligence to get the best out of people.

Robinson should know what he’s talking about. After all, just over a decade ago at the behest of the British government, he led a massive inquiry into creativity in schools and the significance for the economy. In 2003, he was knighted for his efforts. Then, in 2009, he published a book on his ongoing investigations into creativity and education: The Element: How Finding Your Passion Changes Everything.

But perhaps it is his engaging TEDTalks that have disseminated the message most powerfully.

Bottom line, according to Robinson: our current school system educates students to become good workers rather than creative thinkers. Individual creativity therefore has been, and continues to be, ignored, stifled, and/or stigmatized. It is simply anathema to the industrial-revolution model of mass production.

Yet times have changed, even while the education system has not. When Tom Daccord workshopped with TAS Social Studies teachers on September 22-24, he introduced us to Harvard economics professor Richard Murnane and his research into the evolution of workplace skills: The New Division of Labor: How Computers Are Creating the Next Job Market (published in 2004 with MIT professor F. Levy). Murnane’s review of economic and civic changes since the introduction of computers to the workplace shows that routine manual skills and routine cognitive skills have been massively devalued through technology and automation. Meanwhile, demand for complex communication skills and expert thinking skills has risen steadily since 1969, with an upsurge in the 1980s.

This makes a lot of sense to me. Computers have replaced human tasks and responsibilities all across the employment spectrum. Even the infamously complex and long-winded process of checking in at airports has miraculously given way to self-service. (I stepped past the self-check-in stands initially for the familiarity of the face-to-face routine, but the last few times I lined up at the airport, it was to get the job done myself. Consumers like me are voting with their feet.) Clearly, computers do very well in “if-then scenarios.” But what they can’t handle is a new problem. They deal with existing data, so if it’s a new scenario, they are lost. Humans still have the upper hand through our ability to adapt and innovate in response to changing circumstances. We also have a distinct advantage when it comes to communication – we trump computers at complex social interaction. Therefore, what the world of work demands from graduates is excellent communication skills and creative thinking/problem solving skills. If the purpose of schools is to prepare students for the job market, our educational curriculums need to shift away from standardization and routine manual and cognitive skills.

And we need to give our employees-to-be access to technology in schools because, as Daccord cited for us, the fastest growing job sector is technology; 15-18% of new jobs in the next decade will be in technology. Therefore, while technology is taking jobs away from humans on one front, it is also creating jobs on another.

Daccord brought in another “big gun” to reinforce this point. He canvassed Daniel Pink’s book, A Whole New Mind, to spotlight once again the shift from an Industrial/Information Age economy (which relied on left-brain skills) to a new Conceptual Age (which emphasizes right-brain skills).

The 2001 makeover of Bloom’s Taxonomy was prompted by exactly these concerns that our education system needs to prioritize creativity. The top spot in the order of thinking processes is now reserved for the verb CREATE.

In case educators need some prompting about how to “create” with information and communication technologies, Andrew Churches added some digital jargon in 2008 (“Bloom’s Taxonomy Blooms Digitally” in Tech & Learning):

Creating = designing, constructing, planning, producing, inventing, devising, making, programming, filming, animating, blogging, video blogging, mixing, remixing, wiki-ing, publishing, videocasting, podcasting, directing/producing, creating or building mash-ups.

While some educational theorists are reinventing old taxonomies, others are developing new learning theories. George Siemens (in Connectivism: A Learning Theory for the Digital Age) explains that the old theories of behaviorism, cognitivism, and constructivism are limited simply by virtue of the fact that they pre-date, and therefore do not take into account, the impact of technology on education. Connectivism attempts to do this.

The case for a connectivist theory of learning includes the same arguments about job market trends as I’ve mentioned previously. We know that our future graduates will work their way through a variety of jobs in different fields of employment over the course of their lifetime. Higher order thinking skills – creative problem solving skills – are gaining increasing currency as far as employers are concerned in today’s knowledge economy. “Know-how and know-what is being supplemented with know-where.” In other words, learners need to understand where to find the information they need, and how to evaluate, reapply and transpose it to new situations.

Furthermore, Siemens argues that a new theory of learning needs to take into account new technology-enabled trends in learning. For example, formal learning is giving way to informal learning through personal networks and online communities, and learning is now ongoing in the world of work. In addition, the technology tools young people are using are rewiring their brains, changing how they think and learn. Another line that is blurring is that between individual effort and organizational learning.

At the heart of it is connection-making. Identifying connections between disciplines, fields, and concepts is key, as is nurturing and maintaining connections in order to facilitate continual learning.

We know this is an essential understanding that Jeff Utecht wants us to take away from Course 1 – that the internet is really a mass of connections, and that these connections “trump content.” That the advent of the internet hasn’t left us all as isolated hermits withering away in the perpetual darkness of our home offices, ordering pizza online while the old pizza boxes pile up and fester around us, as Sandra Bullock’s old movie, The Net, would have led us to believe.

Instead, the internet has played host to an ever-growing number of online communities and changed the landscape of people’s social lives, student lives, and work lives.

In Jeff’s TEDxTalk, he reeled off an incredible statistic from the CEO of Google, Eric Schmidt: “We create as much content in two days as we did since the beginning of mankind until 2003.” That’s an incredible amount of information out there right now, and an unfathomable amount yet to be created. Knowing how to connect to this information, and knowing what to do with it, will henceforth be critical to success and satisfaction in our personal, social, and working lives.

The Horizon Report 2010 underscores these same points about the need for educators to focus on helping students navigate our new information-rich, highly connected, increasingly collaborative world. No doubt Sir Ken is pleased to see the report recommend strongly that schools emphasize “critical inquiry and mental flexibility” and also provide today’s learners with the necessary tools to engage with broad social issues and tackle large-scale civic action.

So now watching Ken Robinson exhort us to “Bring on the Revolution” has begun to seem less revolutionary to me. Still compelling. But more and more like “old news” – because Murnane, Pink, Siemens, Utecht, and others have been chomping at the bit about this for some time already.


In fact, it’s all starting to sound like common sense.

Actually, I feel as though there are two things that need to go on at the same time here. On the one hand, educators need to facilitate social and global connectivity to enhance learning. On the other, we need to remember that technology in education also offers huge potential for personalizing and customizing education. One way this is already happening, driven by student-consumer demand, is with online learning. A recently published study showed that the number of students enrolled in some type of online course climbed from 50,000 in 2000 to more than 1 million in 2008. It was reported that these are typically high-schoolers taking courses not available at their local school, or they are catching up on classes they did not pass the first time around. But most interesting is the number of students receiving their entire education online: 200,000.

“For better or for worse, imagine a near future in which your avatar can attend high school in a Second Life-like environment, your body no longer required to sit quietly in a row and your mind no longer obliged to settle for what the local district can offer. You won’t need a locker, and if you realize with swooping horror that there’s a big test today and you’re not ready, you can stop time and study until you are. And your avatar’s skin is clear. And you can fly.”

Yes, technology can certainly provide more personalized learning – a prerequisite for nurturing individual creativity. Thus teachers – and yes, that means me – need to be more imaginative and creative in how we use IT.

Why me? Because, as Sir Ken reminds me, education is what is happening in my classroom, in the malleable minds of the individual students that I see for 95 minutes every day. Not in the school board room, not in the offices of school administrators. Not in government committee rooms or in international think-tanks on education. But in my classroom. It’s what’s happening in the brains before me. The education my students are getting is the result of what I’m doing with them every day. So it’s my everyday practices that need to change.

Right in Front of their Faces

I suppose at some point all the whining, worrying, and complaining needs to give way to action, which takes into consideration the fact that the “electronica” is here and the kids are all over it. As educators we know that good teaching starts where the kids are at, and then prompts them to go beyond. So, I guess, that’s where I’m going – just mindful of the excesses.

Certainly, there are plenty of examples of educators and educational institutions capitalizing on their students’ interests and taking their curriculum and learning goals to the students – instead of expecting it to work the other way around.

Purdue University is one such example. According to an October 6 article, a technology team at this university has taken the concept so far as to marry student learning with Facebook social networking.

Although Facebook is often cited as a distraction to serious learning in schools – at TAS it is blocked to middle school students – the Purdue team’s efforts at “Mixing work and play on Facebook” have led to the creation of an application that sets up an e-learning environment within the site. “Mixable” operates a lot like a traditional study group, except that the meeting space is on Facebook, and the learning and materials are shared and managed by student users.

Using course registration information, Mixable provides a virtual space for students that adheres to the basically free-form principles of Facebook use: users are free to discuss whatever they want, to make posts available to some users and not to others, and to participate or not. In order to “bring academics into social media,” the application does much of the administrative work, such as automatically creating groups and pages for students enrolled in the same courses. It also organizes the files that students post to the course page – images, videos, links, podcasts, and documents – into library spaces, so they can be accessed more readily, all within Facebook.
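Purdue hasn’t published Mixable’s internals, but the two administrative chores described above – grouping students by enrollment and filing their posts into per-media-type libraries – are simple enough to sketch. The Python below is purely illustrative; the function names and data fields are my own invention, not Mixable’s actual code or API:

```python
from collections import defaultdict

def build_course_groups(enrollments):
    """Group student IDs by course code, as Mixable reportedly does
    from course registration data. `enrollments` is a list of
    (student, course) pairs."""
    groups = defaultdict(set)
    for student, course in enrollments:
        groups[course].add(student)
    return dict(groups)

def organize_library(posts):
    """File each posted item into a 'library space' keyed by media type
    (image, video, link, podcast, document)."""
    library = defaultdict(list)
    for item in posts:
        library[item["type"]].append(item["title"])
    return dict(library)
```

The point of the sketch is the design idea, not the code: the student does nothing administrative, because group membership falls out of registration data and the library organizes itself from metadata the posts already carry.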

In order to explain why this learning management tool has been created specifically for Facebook, Gerry McCartney, Purdue CIO, invokes the bank robber, Willie Sutton. When asked why he robbed banks, Sutton apparently said, “Because that’s where the money is.” McCartney follows this up with: “So why go to Facebook? Because that’s where the students are.” Perhaps Facebook could yet catch on as a learning interface like some other Web 2.0 tools have. It’s in the hands of the students now, quite literally.

For Better and/or Worse

“Teachers have always had a responsibility to help their students grow out of youthful isolation, digital or not, and become intellectually curious about the wider world around them.”

The digital world is a challenge to me, to other educators – and to our students – but it is also one that we must adjust to, according to Carol Jago, President of the National Council of Teachers of English. In a September interview, Jago said that “the digital world offers benefits but also pitfalls for education at all levels.”

While our focus as educators should clearly be on enhancing our curriculum with Web 2.0 tools and other “edtech” innovations, we still need to be mindful of what students are doing with technology outside our curriculum work, because it certainly has an impact on the quality of that work.

We know, for example, that young people absorb themselves for hours and hours at a time with online chatting, video-gaming, and variously dabbling with digital devices. A Kaiser Family Foundation study cited by Jago found that “people between the ages of eight and 18 spend an average of seven-and-a-half hours a day ‘plugged in’ to digital devices.” Of course, this is going to get in the way of studying and homework. Jago echoes Prensky’s solution: teachers must figure out how to transpose those things that make gaming, texting, and Facebooking so fascinating into our instruction (although she stops short of Prensky’s opinion that turning lessons into video games will go a long way to solve the problem).

Digital distractions are disparaged for another two-pronged problem they create. Not only do they distract from the time students have to sit down with a book and read, but as Jago worries, Web-based reading may also discourage deep, analytical reading.

But technology seems to be Janus-faced – for every problem created, a plethora of possible solutions opens up. A Scholastic study released at the end of September found that many kids want to read books on digital devices and would do so more frequently if they had access to e-books. The findings from a survey of more than 2,000 children ages 6 to 17, and their parents, highlight the potential of e-readers, computers, and mobile devices for encouraging digitally-savvy students to read more.

Kids have embraced this technology ahead of their parents. The statistics showed that many more kids than adults had used e-readers, and many more were interested in doing so. Clearly, young people view the same tools they use for socializing and gaming as opportunities to read.

Still, many parents worry. This study aligns very much with my own conversations at parent-teacher conferences over the past couple of years (and coming up again next week!). Parents really worry about whether their “modern multi-tasking adolescent” has the perseverance to get through an entire novel. They worry whether their sons and daughters have developed stunted attention spans, too engrossed in fast-moving ideas to voluntarily restrict themselves to recreational reading.

I am certainly sympathetic to these concerns. I guess my best response is to present research findings, such as were reconfirmed by the Scholastic study, that parents can still have a big impact on their children’s reading lives by providing interesting books to read at home and also by setting limits on time spent in front of the computer – particularly with the goal of ensuring kids get adequate sleep.

But there’s still more to worry about.

The report confirmed what many of us already suspect: that children are altogether too trusting about information they find on the Internet. The alarming statistic: 39 percent of children ages 9 to 17 said the information they found online was “always correct.” Obviously, this needs to be addressed through collaboration between library media specialists and classroom/core teachers in efforts to raise information literacy. We can’t just assume that somebody else in school is taking care of this – the alarming statistics are a salient warning. Better to reinforce website evaluation skills, for example, every year than to assume another teacher or subject has got it covered.

How else is technology, or more specifically, social networking impacting our students’ lives? Well, again, both for better and for worse. The double-edged sword of perpetual connection was chronicled in an Associated Press-mtvU poll, released October 7. It showed that 57% of students said that life without technology would be more stressful, and yet a significant 25% said it would be “a relief.” The pressure to keep up with text messages and Facebook communication causes a lot of stress – this according to a majority of respondents. Waiting for replies to a message is also stressful, as is the process of interpreting messages: nearly half of respondents worry about whether the messages they receive are meant as jokes.

News stories grab headlines with worst-case-scenario stories of cyber-bullying and suicide, but these are a drop in the ocean compared to the low-level but perpetual stress that comes from being tethered to technology.

On the other side of the equation, though, sites like Facebook provide new and increasingly popular avenues for seeking emotional support. Yet, the very public nature of social networking means that cries for help come at both the advantage and disadvantage of exposure. Young people are both more visible and more stressed about it because they can’t always control the information about themselves that’s available online.

More statistics from the study provide evidence that social networking is both a blessing and a curse. They show “a window into a world where 8 in 10 students say their lives are happy—yet 6 in 10 say they’ve recently felt too stressed to hang out with friends, an increase over the past two years. Similar numbers say they’ve been too agitated for school work. Twenty percent say they have a friend who has discussed suicide over the past year, and 13 percent say a friend has tried to kill himself or herself. Nine percent have considered it themselves.”

These are the complex forces at work in the minds and bodies that come into my classroom every day. I feel for these kids who don’t have the advantage of anonymity that we had. Flippant remarks I may once have made are lost in time (I like to think!); yet, they are permanent digital footprints for our students. Things that we used to say verbally are now online – permanently “out there” – and searchable.

The following video reinforces the point that kids have to think twice before posting anything online. I will use it in conjunction with the second video, which was greeted by stunned silence and then a very excited discussion last year when I shared it with my classes:

YouTube Preview Image

YouTube Preview Image

Once again, I am reminded that educators are responsible for more than just their curriculum – the success our students show with our curriculum is utterly impacted by what is going on in their very public private lives. So their real and virtual lives are very much our concern as well.

The Costs of Constant Connection

A Newsweek article from August this year – “Lost in Electronica, The costs of ‘the chaos of constant connection’” – gave me pause. It asks us to re-think boredom – usually considered a bad thing – as a special privilege of the complex brain, and thus an aspect that distinguishes us from our animal friends. George Will posits that our capacity for boredom is essential to our humanity because it means we have the mental ability, and the space, to reflect and plan, space for empathetic thinking and community-action. A vital ingredient of global citizenship. And one that is increasingly absent in our constantly connected, constantly stimulated society.

Taking a cue from the evolutionist argument, Will suggests that our efforts to skirt boredom can be explained as the response of a brain formed in dangerous prehistoric times and wired to be constantly alert and vigilant. Our modern lifestyle has brought with it an unprecedented climate of safety and, thus, sedentary brain function, but our minds still seek stimulation and find respite from boredom in audio-visual entertainments. New technologies now allow young people’s brains to remain in an almost constant state of being “switched on” so that boredom can be utterly assuaged.

Will is actually reflecting on an article, “The case for boredom,” by clinical psychologist, Adam J. Cox, author of Boys of Few Words: Raising Our Sons to Communicate and Connect (Guilford, 2006) and No Mind Left Behind (Perigee, 2007). Cox describes the minds of today’s teenagers as obese with the sudden abundance of electronic stimuli. Just as human beings have gorged on the far greater quantities of salt, sugar, and fat that the modern diet allows in contrast to our genome’s formative times, our modern minds now crave “junk nourishment” in the “ubiquitous barrage of battery-powered stimuli delivered by phones, computers, and games” which together form an “addictive electronic narcotic.”

As with all things electronic, the last half century has seen an exponential increase in this phenomenon. Previously, boredom might have troubled the brain after an hour or two of nothing much to do. But these days, kids feel bored far faster. As constant stimulation and amusement becomes the “new normal,” boredom is disappearing, and with it, by definition, the “available resources for thought, reflection, and civil behavior.” With “excess amusement” young boys (for they were the focus of Cox’s clinical work) are induced into a “pleasant trance from which they do not care to be awakened” and from which they “fail to launch” from self-centered adolescence into the adult world. Cox reminds us that being a responsible and contributing citizen is rarely fun – “it requires patience, forethought, and some willingness to tolerate tedium.” Thus, less boredom means less opportunity to practice civility and civic-mindedness.

The ominous corollary is that our constantly connected young people are stuck in an “electronic playground” of hyperstimulation. In fact, with the diagnoses of learning and attention deficit disorders going through the roof, Cox argues that at some point we just have to drop the label “disorder” and call it, simply, the way we really are right now. And the way we are is amused into a sort of self-absorbed oblivion.

Concerns about how our society is “dumbing down” are not new. Bradbury, of course, was on to it in the 1950s. Neil Postman was still on about it in the 1980s (in his book Amusing Ourselves to Death), arguing that politics, religion – rational argument and the quality of information – had all been diluted and made subservient to entertainment. He writes that consumers have basically, and voluntarily, surrendered their rights in exchange for entertainment.

While Postman pointed the finger at television, Cox points out that the available electronic stimuli have multiplied, with dire consequences for the next generation: “Unlike reading and listening to stories, the blitz of electronica doesn’t build deeper listening skills or a greater range of emotional expression.” He says, “Not only does withdrawal into electronica enable them to bypass the confusion and pain of trying to give their emotions some coherence, it also helps them avoid the realities of being a flawed, vulnerable, ordinary human being.”

Will adds further concern to this social phenomenon of stunted maturity with findings from the field of neuroscience. Brain scientists have shown that the “mature” brain is not a finished product and can, in fact, be rewired by intense and prolonged experiences. “Some research suggests that the constant short-term stimulation of flitting to and fro among digital promptings can impede long-term memory on which important forms of intelligence depend.”

Thus, Will expands Cox’s concerns to embrace us all. He argues that it is not just young boys and not even just the next generation who have all too successfully beaten away boredom. “Adults of both sexes, too, seem insatiably hungry for handheld devices that deliver limitless distractions.”

“We are in the midst of a sudden and vast social experiment involving myriad new means of keeping boredom at bay. And we may yet rue the day we surrendered to the insistent urge to do so.”

So, what do we make of Cox and Will, Postman and Bradbury? Are these just old guys on the un-cool side of the generation gap? Or have they got a (frightfully) good handle on the nature of the next generation?

Perhaps we could simply put it all down to that digital divide between natives and immigrants (to use Marc Prensky’s terminology to distinguish between people born before and after the advent of the computer). Digital immigrants stand out for their predigital dispositions; Prensky calls these digital “accents” in a blog for Edutopia (“Adopt and Adapt: Shaping Tech for the Classroom”). Having come to technology later in life, they tend to downplay the importance of new tools, concepts, and practices made possible in the new digital world, such as the importance of online relationships as compared to face-to-face ones. He states unequivocally, “Such outmoded perspectives are serious barriers to our students’ 21st-century progress.” Prensky positions himself squarely on the side of the natives in his 2006 book, Don’t Bother Me Mom–I’m Learning!, which champions the educational benefits of video game play. He argues that children “are almost certainly learning more positive, useful things for their future from their video and computer games than they learn in school!”

Educationally speaking, therefore, should I be glitz-ing up my teaching with technology? Should I be seeking to engage my students through entertainment? Video games and social networking sites – are these really my best allies in teaching and learning? Should I be making such efforts to go to where the kids are – “hanging out, messing around, geeking out” online – that education looks a whole lot like what kids are already doing with technology? So students don’t even notice when I am slipping in a bit of teaching on the side?
Or is it okay if learning is a bit boring sometimes?

“To be able to delay immediate satisfaction for the sake of future consequences has long been considered an essential achievement of human development.” So sayeth Shoda, Mischel, and Peake in reporting their famous (1990) experiment on delayed gratification in the American Psychological Association’s journal, Developmental Psychology (“Predicting Adolescent Cognitive and Self-Regulatory Competencies From Preschool Delay of Gratification: Identifying Diagnostic Conditions”). So, should I put that marshmallow (or Oreo cookie, in some replications of this experiment) on the desk in front of them? Or is a little bit of delayed gratification (and hard slog) a good thing for my students?

Certainly there seem to be some serious differences of opinion on either side of the digital divide when it comes to information obtained over the internet and concepts of originality – intellectual property and copyright. Some say that many digital age students simply do not understand that using words they did not write is a serious offense. If these digital natives have an “accent” then it is decidedly colloquial. Lazy, even.

On the one side, there is the perspective that the vast buffet of online information is open to the global community and therefore counts, basically, as common knowledge. Cutting and pasting has canceled out the concept of authorship. Digital natives – who have grown up with file-sharing, web-linking, and Wikipedia – assume that the information “out there” is available for anyone to take. This puts them profoundly at odds with educators and older adults who bluntly call this plagiarism.

A New York Times article (“Plagiarism Lines Blur for Students in Digital Age” August 2010) claims that the number of university students who believe that copying from the Web constitutes “serious cheating” is declining. The Western concept of intellectual property rights and singular authorship seems to be on the way out. It’s been with us for a while, anyway – since the Enlightenment, one could argue. So is this simply a natural waxing and waning of an ideal? Or is it the typical lack of discipline we tend to find in younger generations? Inevitable paradigm shift? Or “Generation Plagiarism” impatient for a fast-food, corner-cutting solution to the problem that writing is difficult and good writing takes time and practice? One university official in the article reported that a majority of plagiarism cases at his university involved students who knew perfectly well that they needed to credit the writing of others. They copied intentionally, he said, knowing it was wrong. They were just unwilling to apply themselves to the writing process.

Or did they just recognize that it was wrong in the eyes of the old dudes in charge? In the same article, it was argued that notions of originality and authenticity have changed with the popularity of the “mash-up.” Student writing now tends to mimic the sampling, combining, and synthesizing that is more and more evident in music, TV shows, and YouTube videos.

Students themselves are less interested in creating a unique identity as young people were in the 1960s. Instead, they try on different personas like they try on different outfits, enabled by social networking and file-sharing technology. Borrowing freely from whatever’s out there, this might just be a new model young person: “If you are not so worried about presenting yourself as absolutely unique, then it’s O.K. if you say other people’s words, it’s O.K. if you say things you don’t believe, it’s O.K. if you write papers you couldn’t care less about because they accomplish the task, which is turning something in and getting a grade. And it’s O.K. if you put words out there without getting any credit.”

Of course, this kind of “us and them, either/or” talk is dramatic and thought-provoking, but it doesn’t really reflect what I’m thinking. Indeed, when it comes to technology in education, the polarized perspective is neither reasonable nor practical. The genie is already out of the bottle – the laptops are already in their hands at TAS. Indeed, the students in my classroom do not know a pre-MySpace/Napster era.

Now, for me, it’s about tapping into the potential of electronica to enhance education through its powers of motivation and customization and collaboration. AND balancing this with some old-fashioned discipline of the mental, social, and physical variety. Why? Because the gatekeepers are still the old dudes. The college admissions officers still care (somewhat) about school grades and SAT scores. But the revolution is coming and the students will be leading it. So, as Jeff Utecht commented during his TEDxTalk, I need to be socially networked, personal-learning-network-connected, or just frequenting Facebook, to see it coming!

Technology & Twisted Fiction: Does Bradbury’s twisted future have anything left to teach us?

I’m looking forward to helping Ray Bradbury exorcise his dystopic demons again this year. I have been studying his Fahrenheit 451 with my students for the past six years, and invariably he leaves them worrying and wondering, at least a little, about the state of our society.

This is the unit of work that our group of three Grade 8 English/Social Studies teachers has identified for enriching with technology to meet our Course 1 goals, and our teaching goals too, of course.

It amazes me, actually, that students engage at all with Bradbury’s metaphor-manic prose. As I explain to the students, whose eyes have just about rolled to the back of their heads by the end of page 2, this novel is really the author’s soapbox stand. It’s an urgent message he wants to get out to all of us about censorship and conformity, about misguided attempts to find happiness, about loss of humanity, about being entertained into mental oblivion … And about the impact of technology on society. He could have gone into politics, or the music industry, or taken to graffitiing important monuments with his message. But instead he wrote a book. I am pretty blunt – I don’t promise a compelling plot or rich characterization (after all, they have just been indulged with John Steinbeck’s Of Mice and Men). I tell them they will have to read Fahrenheit 451 as detectives. They need to wade through the clues of Bradbury’s literary puzzle, and take in the sprinkling of storyline on the side, to unearth the rich themes which are strikingly relevant today for a book that was written in the early 1950s about a future imagined in the 24th century.

For much of the story we see technology perverted to the whims of a faceless, nameless, all-powerful government/media monopoly that seeks to maintain control through a devious mix of censorship and entertainment. These “authorities” have taken advantage of the citizens’ increasing laziness about wanting to know what’s going on in the world, about wanting to truly learn about life, warts and all. Instead, people are increasingly satisfied to be passive, unquestioning recipients of a wave of empty factoids that are steadily beamed at them through interactive big-screen TVs that have substituted for walls in people’s homes, and also through the radio earbuds (iPod precursors?!) which they wear constantly. These plug up their ears to the point that citizens have learned to lip-read, so as to avoid dividing their attention between the gripping infotainment streamed directly to their ears and the people they live and work amongst (but with whom they don’t really engage). Not only has technology enabled mental laziness, it also facilitates physical laziness by taking over basic daily functions, such as passing the breakfast toast from toaster to plate. It also comes into play when it’s time to pick up the pieces of the citizens’ sad, empty lives. Drug overdoses are so frequent that doctors no longer attend this kind of emergency; instead, a machine has been invented to do the stomach-pumping and blood transfusions, and it requires only a machine operator to plug it in and turn it on. Citizens are both willingly and unwittingly giving up their autonomy to technology.

Bradbury’s futuristic society seems to be a sad and sorry mix of George Orwell’s 1984, in which the citizens are oppressed by their government, and Aldous Huxley’s Brave New World, in which citizens are oppressed through their addiction to amusement. It is all very oppressive!

Fortunately, the last stages of the book reveal an alternative future. One rebel has developed a two-way radio earbud so he can stay in contact with his new friend, the protagonist, and so together they can fight the forces of evil. And finally, at the end of the novel, our hero meets a group of outcasts who seem to have technology firmly in hand – quite literally, in the case of the televisions, which have been downsized into hand-held devices, handy for news updates but also easy to put down and walk away from.

This leads me to wonder how easy it is for us to walk away from technology these days. It is a question I put to our students and we have, for a number of years now, conducted our own social experiment with a voluntary technology fast – or, rather, e-media fast.

Every now and then, I read of other teachers and students pondering the same question. Just this month a high school in Portland, Oregon, attempted their own “fast” to prompt the same kind of self-reflection: “Students at Portland’s Lincoln High School unplug, experience life without technology”. Ironically, by switching off, kids are prompted to switch on their consciousness: “You don’t have to go live in the woods, but you have to be conscious about how you use these electronics,” one student reported in the article. “When people are educated about what they’re doing, that’s when they can make a personal decision to change.”

My own students have recounted widely varying personal responses. (We have tried to do it over a holiday period, because some level of laptop use would be required on a typical school day.) Some highly active kids just get on with their physical activity – no big deal. Highly social kids organize get-togethers in real time and space, instead of cyberspace – they plan sleepovers, bake cookies, and hang out together. For some, it takes organization and effort to stave off the boredom and disconnection they feel without technology. For others, it is surprisingly tough and they quit early. A small group invariably reports that they know they couldn’t get on with life without technology, and so what? They don’t have to contemplate a life without technology, anyway, so what’s the point? They are self-confessed addicts, but the addiction seems pretty harmless. Or is it?

Opening Up

“There’s nothing to writing. All you do is sit down at a typewriter and open a vein.”

~Walter Wellesley “Red” Smith

Unfortunately, although I now have the relative luxury of my laptop and Microsoft Office Word 2007, I don’t feel that the writing is coming any easier. And Mr. Smith was making these dramatic claims as a sportswriter, after all. Isn’t that simply a matter of collecting clichés? I, on the other hand, am tasked with writing something worth reading on the subject of education and how we can turbo-charge it with digital power-tools.

I have tormented myself with inaction over this blogging assignment since it was first proposed by Jeff. The tech part I feel I can figure out by myself or with the help of a kindly middle school IT coordinator in exchange for extra play dates/babysitting for her young children. And the education part – no worries there. I love it. I got into teaching for the learning.

It’s the composing and exposing of myself that is excruciating for me.

William Wordsworth isn’t much help: “Fill your paper with the breathings of your heart,” he says. Thanks for that. It’s the echoing emptiness in my head that makes me so hyper-aware of my heartbeat anyway, and I’m already heavy-breathing (or maybe it would be classed as hyperventilating now) about the prospect of 8 blogs and a looming end-of-course deadline.

“A word is not the same with one writer as with another. One tears it from his guts. The other pulls it out of his overcoat pocket.”

~Charles Péguy

That’s more like it. I picture my incredibly intelligent colleagues with fingers fairly flying over their keyboards as their reflections on teaching through technology enliven the blogosphere. Their wisdom – complete thoughts, rich sentences – sprouts forth fully formed… Meanwhile, my process is more of the gut-wrenching, hara-kiri variety.

Surely writing would come more easily at this point if I felt I had anything of value to say, anything to add to the already loud chorus out there about digital tools and their potential for pushing us into a new paradigm for understanding teaching and learning. But why should I add my timorous voice to the bold banter of the expert bloggers out there?

Okay, so I understand the point of writing – on a personal level. I am a teacher of writing, after all. For years I have been explaining to the raised eyebrows in my classroom that we write to think. We write to give ourselves pause for thought. (A looooong pause, in my case.) We write to think more deeply. The writing process should be a process of personal discovery, of learning. To promote critical thinking and earnest self-analysis is the bottom line of our mission as teachers, right?

The new NETS for Teachers get to this, of course: “teachers should promote student reflection … to reveal and clarify students’ conceptual understanding and thinking, planning and creative processes” (1c). Can you guess which tid-bit I conveniently exchanged for an ellipsis?

And, yes, if it’s good for them, it’s good for me too. Of course, I’m going to want to distance myself from the type of teacher parodied in the YouTube clip “When I Become a Teacher” (which is pretty much a video version of the old, offensive adage: “Those who can, do; those who can’t, teach”).

But I haven’t just been asked to write for me. I’m writing for my peers, and whoever else stumbles across me out there in cyberspace. This scares the pants off me. Now it’s a whole other ball game. We’re talking not only about writing to think, or writing as a dialogue between student and teacher, but writing to PUBLISH. And this is where I wrestle with the value of blogging for me. I have always considered that the role of a writer (a writer who seeks a public audience, that is, publication) is not to say what we can all say, but what we are unable to say. (Referentially and deferentially yours ~Anaïs Nin) If I don’t have anything new to bring, why bring it? I know a bit of blog cross-referencing and name-mentioning is a nice form of flattery for other bloggers, but that’s just Commenting, not Writing.

Shelly Blake-Plock, in his post on “Why Teachers Should Blog” (from the TeachPaperless blog), describes a poor wretch of a student in his education class who sounds a lot like me. The student complained that he had nothing to blog about because he had nothing to offer to advance the discussion. Blake-Plock says his student is wrong. To blog is to teach yourself to think, he says, and then goes on to admonish the student for over-thinking. Stop it, he says. Don’t think too much, just write. Just bare your brains out there. Be embarrassed. Be confronted by your own inadequacies, amplified exponentially by the very public nature of your idiocy. Pick yourself up, dust yourself off, have another go. Think again. Grow. And grow up. That’s the point of the exercise.

But is failure a prerequisite for greater understanding? Must that failure be on a grand scale, in front of a broad readership, which includes the people I have to front up to work with tomorrow?

Must we struggle, must we fail, must we do it spectacularly and in public?

Haven’t some significant contributors to human progress done so without epic failure or even initial embarrassment? Have their endeavors always been collaborative? Haven’t they had space to think/invent/compose in private? Or at least to determine their audience – who the great work is revealed to, and when? Do I owe myself and/or my students some space to do the same? Or should we be forced into the full glare of the online world on somebody else’s timetable?

Blake-Plock declares that it is teachers who have “the power to teach a generation that to fully live and to fully know one’s self is to fully live and to fully know one’s self in the public conversation.” Wow! Is public success and failure the only kind that really counts? Is there no personal, private triumph? (Heck, I’m feeling a small victory coming on as I near the end of this post!) Just because we can – because we have this vehicle, this world wide web, to reach across the globe – is there an imperative to do so?

Can’t we just read our history (and an expert’s top 10 list of expert blogs on the subject) and learn from the mistakes of others?

(Please?)