
To Read Well on Screens, Change Your Mindset

15 Mar

I’ve written the following post to accompany (and extend) my talk on “Cultivating a Digital Reading Mindset in First-Year Composition” that I’ll be giving this Friday at the 2017 Conference on College Composition and Communication in Portland, Oregon (Session I.38).

While I’m primarily directing this toward an audience of college instructors, I hope that teachers at all levels—as well as anyone interested in the differences between print and screen reading and how to become better at the latter—will find something useful in it. 

To find your way to any of the sources cited below, as well as to a number of other articles, studies, and books on this subject, I encourage you to visit this annotated bibliography of sources on digital reading that is available on the websites of UC Berkeley’s College Writing Programs and UCB’s Center for Teaching and Learning. (P.S. Thanks to Jason B. Jones of Trinity College for this shout-out on the ProfHacker blog of The Chronicle of Higher Education.)

Facing the Screen Honestly

My appeal to you today depends, in part, on how you feel about reading on screens and how you approach that subject with your students, especially those students who are early in their college careers:

Are you one who asks (or allows) your students to read many, or most, of the texts for your course in digital form, both on- and offline?

Or do you tend to require your students to read most, or perhaps all, of the texts for your course in printed form?

If you are in the first group, I ask you to consider how much time you spend in your class explicitly addressing the practice of reading on screens. If the answer to that question is little to none, or if you don’t see a significant difference between reading on screens and reading in print, I’d ask you to reconsider those positions.

If you’re in the second group, and you focus largely on reading in print with little focus on reading on screens, I’d ask that you reconsider that position as well.

Our students, and we, are reading more and more texts—for school, for work, for pleasure—on screens, and so it behooves us as teachers to squarely face that reality. At the risk of sounding overdramatic, I believe our students’ futures and our country’s future depend on us doing so.

“A Place of Apprehension Rather Than Comprehension”

One of the preeminent scholars on reading, Maryanne Wolf, the author of Proust and the Squid: The Story and Science of the Reading Brain, puts it well when she says of our understanding of digital reading: “We’re in a place of apprehension rather than comprehension” (qtd. in Konnikova). Indeed, though there continues to be much study of both digital and print reading, there has also been considerable anxiety about the decline of print and the effects that impaired reading could have on each of us and on our society as a whole, especially as most of us dive deeper into real-time, ever-shifting experiments on the effects of digital devices.

In his book, The Gutenberg Elegies, published in 1994 at the dawn of the Internet era, Sven Birkerts noted “some of the kinds of developments we might watch for as our ‘proto-electronic’ era yields to an all-electronic future,” each of which anticipates troubling aspects of our situation two decades later: 

  1. Language erosion
  2. Flattening of historical perspectives
  3. The waning of the private self (128-130)

Warnings like Birkerts’ are worth paying attention to. Also, as we know, when any new medium gains traction, anxieties—some well-founded, some not—tend to abound, whether we’re talking about previous debates over the effects of television in the 20th Century; or the growing popularity of novels, newspapers, and magazines in the 19th; or even further back to Socrates worrying in Plato’s Phaedrus that if writing were to replace the oral tradition, it would “implant forgetfulness in [men’s] souls.”

In our present era, these anxieties sometimes break down into a kind of computer binary where we’re faced with an unsatisfactory choice between either print or digital texts. I’ll turn to an excerpt from a 2012 speech at the Nashville library by Margaret Atwood, she of the brilliant dystopian novels (and pithy tweets), for a useful challenge to that binary:


Margaret Atwood (Source: Poetry Foundation)

In [reading texts in] short form, [digital tools offer the virtues of] speed and ubiquity for small narrative bites. [Then in] long form, people split into three groups:

Number one: ‘I wouldn’t read online if you held a gun to my head.’

Number three: ‘You’re a troglodyte and live in a cave unless you tear up all your paper books and do nothing but read online.’

And most people are in the middle, and they say, ‘We want both.’

Right? So, given that reality, what’s a teacher of reading to do?

What We “Know”: Differences in Print and Screen Reading

Now, reading on screens can mean a lot of different things, and there is a whole series of questions to take into account as we evaluate the differences between print and screen reading:

–Are we online or offline?

–What kind of device are we using, and how are we using it?

–Is the WiFi (and the possibility of distracting notifications) on or off?

–Is the screen backlit or does it employ the e-ink of an e-reader?

–What kind of text are we talking about: a lengthy novel that is native to print? A PDF of an academic article? An online article (like this one) full of hyperlinks and, perhaps, ringed by advertisements?

And so forth and so on. These are important distinctions, and I’ll try to make clear below which types of screen reading situations I’m referring to. Also, the multifarious ways of reading on screens point to just how complex an issue this is.

(And all of this is to say nothing of the rich new possibilities for reading, whether for critical reading or for pleasure, offered by digital texts, a subject that is beyond the scope of my argument here. Among the many, many things that have been written about that richness and about the comparisons of print and digital reading in this regard, I recommend taking a look at the following articles by N. Katherine Hayles and Paul La Farge.)

However, after acknowledging those complications (and benefits!) of screen reading, there are some generalizations we can make.

Words Onscreen

(Source: Amazon)

Among the most comprehensive books on this subject is Naomi Baron’s Words Onscreen: The Fate of Reading in a Digital World, which covers the history of the development of reading, details multiple studies on the subject, examines the screen and print reading landscape, and, importantly, includes her own surveys of student attitudes towards reading in the two media (more on those attitudes in a moment).

Early in her book, Baron notes the following:

“For over two decades, psychologists and reading specialists have been comparing how we read on screens versus in print. Studies have probed everything from proofreading skills and reading speed to comprehension and eye movement. Nearly all recent investigations are reporting essentially no differences” (12).

However, as Baron points out later in her book (and as she elaborated in a subsequent email exchange with me), these findings rely on laboratory conditions directly comparing screen and print reading that don’t fully capture the way we tend to read: “The investigations involve relatively brief readings followed by some version of comprehension or memory questions. What we don’t have—but sorely need—are data on what happens when people are asked to do close reading of continuous text….onscreen versus in print” (171).

However, if we consider our typical, everyday practices alongside the findings of other studies of screen reading—particularly of the way we tend to read on devices connected to the Internet—I think we can agree there are strong indications that, when reading on most digital devices (computers and smart phones especially) as compared to reading in print, we tend to:

–be more easily distracted

–experience eye fatigue from back-lit screens

–be less inclined to read deeply than we might in print

–have less memory of and less comprehension of what we read

–have a harder time getting a holistic sense of the text

–have a lower tolerance for longer texts [reflected in the text-speak acronym TL;DR (Too long; didn’t read)]

Certainly this is the report of a substantial majority of my own students, whom I ask every semester to reflect on the way they read in different media, and it tracks with the surveys of students both in the U.S. and abroad that Naomi Baron has conducted and reports in her book. In the work they do for school, students tend to associate reading in print with better learning outcomes.

Same Skills, Different Medium? Not Exactly

The question arises whether we need to continue to teach traditional literacies associated with print and then help students transfer those skills to the digital realm OR whether we ought to focus on cultivating the different kinds of literacies that reading on screens requires. The research of Julie Coiro of the University of Rhode Island suggests that the answer to both halves of that question is likely “Yes.”

For instance, the findings of one 2011 study that she conducted of the online reading comprehension of a group of seventh graders “support other research that suggests the processes skilled readers use to comprehend online text are both similar to and more complex than what previous research suggests is required to comprehend offline informational text” (Coiro 370). At the same time, the study also found positive correlations between offline and online reading comprehension. 

Coiro also saw indications that students needed to have proficiency in newer, online reading skills (e.g. online searching, web page evaluation, negotiation of hyperlinks, etc.) before they could apply some of the skills traditionally associated with successful reading of printed text. Further, the results of the study indicated that “topic-specific prior knowledge” was important to successful reading comprehension for those students who had less skill in online reading, while this prior knowledge was less significant to the reading comprehension of students who had “average or high levels of online reading skills” (Coiro 374).

Coiro is assiduous in listing a series of qualifications for these findings, noting that there are a variety of possible interpretations, and that competency in “online reading” can mean a great many things dependent on the task and the assessment; she calls for further research. However, Coiro’s careful work is among the studies suggesting that students likely need distinct training in how to navigate, say, a book versus a web site in order for the critical reading skills applied to the former to be used fruitfully in the latter.

As Coiro has said elsewhere, “In reading on paper, you may have to monitor yourself once, to actually pick up the book….On the Internet, that monitoring and self-regulation cycle happens again and again” (qtd. in Konnikova). Teaching our students (and, again, ourselves) how to be better self-regulators is crucial to our success as screen readers—especially when we’re online.

Attitude and Belief as Self-Fulfilling Prophecy

As I said above, when I survey my students about the way they read for school and for pleasure, their replies track pretty closely with what Baron has found in the surveys of college students that she has conducted. When they read on screens, students tend to like:

–the convenience of being able to search within a text more easily and the ability to look up clarifying information online while they read;

–the portability and perceived environmental friendliness of digital texts (the latter is a debate for another time);

–and the lower economic cost compared to print.

Meanwhile, they tend to report that when they read printed texts, they:

–are more likely to re-read and to understand the text;

–have an easier time taking notes;

–are better able to focus;

–and are less likely to try to multitask.

These common reports would seem to indicate that students have found they perform demonstrably better as readers in school settings when they read in print. But is it so?

Last spring, I asked a group of 33 students in my reading and composition courses to complete a relatively simple assignment. I had them read Nicholas Carr’s 2008 article, “Is Google Making Us Stupid?” (Later expanded into his 2010 book, The Shallows: What the Internet is Doing to Our Brains.) I told them they could read the online version on The Atlantic’s web site, download the PDF via academic databases, or print out the article in either form, according to their preference. I then asked them to write a two- to three-page summary of the article, followed by a one-page response with their initial assessments of Carr’s arguments. Once they turned in the final draft, I also asked them to tell me in what form they had read the article and what they noticed about the way they read as they worked on the assignment.

Of the 33 students, 13 immediately printed the article and worked with that version, 11 read it entirely online, and the other 9 students did some mixture of online, PDF, or print reading that made it tougher to discern which medium they had used most prominently. Once I had graded the assignment, I looked at the scores of the purely online and print readers, and here’s what emerged:

Of the 13 who printed the article, the average grade on the paper was about a B-minus. (This group tended to include my weaker readers and writers.)

Of the 11 who read Carr’s article online, the average grade was an A-minus. (This group generally included the stronger readers and writers, as the grades would indicate.)

While acknowledging that this sample is far too small to be statistically meaningful, I found the result intriguing and wondered at the time how things had turned out this way—both what dictated the students’ choices of medium and how the respective groups performed on the assignment.

The Shallows

(Source: NPR)

A number of studies of students’ ways of approaching digital and paper texts provide a partial explanation of what may be going on.

In a study of Israeli college students, cognitive scientists Rakefet Ackerman and Morris Goldsmith compared the way the students performed on multiple-choice tests after reading two short texts:  one printed and one a Microsoft Word file on a computer screen. In one experiment, the students were given a fixed amount of time to study, and performed about the same in their reading comprehension of the two types of texts. In a second experiment in which the students decided for themselves how long to study, the students performed much better on the test connected to reading on paper than they did on the test of their on-screen reading.

Ackerman and Goldsmith suggest that these results potentially point less to differences in the two media used for reading than they do to the metacognitive processes being employed by the students. In short, if readers perceive that screen reading can be employed for “effortful learning” in much the same way as they tend to perceive reading in print, then they may be able to self-regulate their reading as effectively on screen as they do on paper, at least so far as simple text (as opposed to hypertext) reading is concerned.

Bringing effort, or not, is partly down to mindset, of course. In a 2001 study that Baron briefly references in her book, a group of college students was examined to see how they performed when they read printed texts “for study purposes or for entertainment.” Unsurprisingly, “Students reading in study mode were better at making inferences, generating paraphrases, and remembering the text’s content” (Baron 161). This makes perfect sense.

Now consider that students use their laptops not only for work but also for gaming, watching funny YouTube channels, and video chatting with friends, and that they turn to their smart phones all the time for Snapchatting, Instagramming, texting, and the like. We can easily see the association between these devices and entertainment, an enticing potential distraction that is ever present when a student turns to those same devices to try to read in more than a cursory way. To engage in the effortful work of reading, that constant self-monitoring that Coiro speaks of comes into play.

A 2015 study conducted by German researcher Johannes Naumann helps further this point. Naumann examined the data from a 2009 OECD PISA Digital Reading Assessment of over 29,000 high school students from 17 different countries. He examined how students navigated through hyperlinked pages to complete various information-seeking tasks, some of which required that students visit only one or a few pages in order to be successful and some of which required the negotiation of multiple pages.

Those students who were more accustomed to seeking information online had more success at navigating and at completing the tasks successfully and efficiently (i.e. not spending a lot of time clicking irrelevant links) than did those who were accustomed to using online spaces mostly as places for social engagement. These differences in success were more pronounced the more difficult the task became. Further, the students with greater print reading skill tended to perform better on these assessments.

His findings suggest that success as a strong online reader depends both on experience navigating online text for more than social purposes and on a mindset of “information engagement.” (For more on these and other studies, see the annotated bibliography I mentioned at the beginning of this post.)

And Now a Word from a Successful Online Reader

A final note on that assignment I asked my own students to do last year. One of my students provided a glimpse of some of what might have been going on for her and the other successful online readers in my class in the reflection she wrote afterward:

“I ended up reading [Carr’s article] online….I read during my research…that we can condition our mind not to be affected by the fact that we are reading online. In a sense, that if we approach…[what] we are reading [in this way], we can get as much out of it as if we [were] read[ing it] in print. So I wanted to try. I told myself that I was [going to] focus on the reading, and read it as if I was doing so in print so that I [could write] an accurate summary….”

After talking about her process a bit, she went on:

“It took me much longer just to plot…[Carr’s] main ideas, and even when I thought I had them, I still had to go back to the article continuously. I did not make annotations [on the text itself], which I [always] do [when I read] in print. But it was interesting to see…what changed when I approach[ed] an online article with the mindset that I [was] reading it [as if it were in] print. I definitely know now that no matter what, reading [in print works] better for me when doing assignments [like this one].”

That’s the kind of self-awareness I’d love all of our students to generate: not the conviction that they must read in one medium or the other to achieve a certain result, but the ability to recognize their own best practices and to read mindfully, no matter the medium.

You’ll be completely unsurprised to learn that she earned an A on her paper.

Suggested Steps for Helping Students to Read Well on Screens

“Enough already! What do I do now?” you fellow teachers of critical reading may be wondering. I’m wondering too. The best approach to reading on screens is hardly a settled question, and will no doubt continue to change as the technology shifts around us. However, I do have a few suggestions, and would love to hear yours too.

First, I direct you to the aforementioned pages on this subject that my Berkeley colleague, Donnett Flash, and I put together. Also, here’s a quick (and evolving) handout (Read Well On Screens and Prosper) I’ve been giving to all my students, and a quicker list version I’ve given to fellow faculty: (Digital Reading Suggestions for Teachers).

For my purposes today, however, I’ll emphasize a few things:

*Help students cultivate a screen reading mindset: the recognition that they’ve got to bring effort, and effort of particular types, to read successfully on digital devices. Perhaps most important, they need to reduce distractions as much as possible and resist the medium’s associations with speed, efficiency, TL;DR, and entertainment. Power browsing, skimming, scrolling, and reading for gist are useful, but they aren’t everything, and they shouldn’t be the only thing.

*Create self-reflective screen reading assignments that will help students to be more self-aware and to identify their own best practices.

*Think carefully about how you’ll employ digital readings in and outside the classroom. Particularly with younger readers, do not substitute digital readings for print if you don’t plan on addressing the differences between them.

*Discuss, model, and reinforce screen reading skills explicitly.

*Identify and advocate for new technologies and practices that will deepen screen reading skills. And then tell me what they are.

*Learn from your students. Students are already discovering the technologies and approaches that help them to concentrate when they’re in “study mode.” Many of my own students employ ad blockers when they are online, or use applications like Be Focused Pro and Self Control to prevent themselves from getting distracted. They are mindful of how they need to arrange themselves physically—and how far away certain of their devices need to be—to be able to read well. Keep a record of these technologies and practices, and mark the best ones.

Reading is Fundamental:  Toward Democratization and not Gentrification

I hardly have to preach to you, the converted, on the virtues of reading generally and the importance of helping students to read well on screens in particular, but I’m going to do it anyway.

We’ve all seen students coming into our classrooms with different, sometimes radically different, levels of preparation for college-level reading and writing. This often maps onto larger societal divisions between the haves and have-nots:  those who come from wealthier communities with better equipped schools and those who do not; those who have long training and practice at reading and those who do not; those who have better access to and familiarity with digital technologies and those who do not. College is a place where we try to help close these gaps and reduce the inequality that plagues our country.

Focusing on reading well is ever more important now in a time of rising inequality, of difficulty at discerning real news from fake, and of divisiveness in a political landscape in which empathy is on the decline and misunderstanding is ascendant.

Teaching students of all backgrounds to read well on screens is a democratizing practice. As we spend more and more of our time reading (and talking and watching and living) on screens, cordoned off in shrinking electronic worlds of our own making, such teaching helps ensure that reading—and the benefits that flow from it—doesn’t become a gentrified activity whose advantages are available only to a privileged few.

In this same vein, I’ll let the novelist Caleb Crain have the last word here as he reminds us of what is at stake:

“[T]he N.E.A. reports that readers are more likely than non-readers to play sports, exercise, visit art museums, attend theatre, paint, go to music events, take photographs…volunteer….[and] vote. Perhaps readers venture so readily outside because what they experience in solitude gives them confidence. Perhaps reading is a prototype of independence….Such a habit might be quite dangerous for a democracy to lose.”


Works Cited

One more time: For an annotated bibliography about these and other sources on digital/screen reading, I direct you here. In the meantime, here’s where to find the sources discussed in this post.

Ackerman, Rakefet, and Morris Goldsmith. “Metacognitive Regulation of Text Learning: On Screen versus On Paper.” Journal of Experimental Psychology: Applied 17.1 (2011): 18-32. Web.

Atwood, Margaret. “Nashville Public Library Literary Award Winner 2012.” Online video. YouTube. Nashville Public Library, 4 Nov. 2012. Web.

Baron, Naomi. Words Onscreen:  The Fate of Reading in a Digital World. New York:  Oxford UP, 2015. Print.

Birkerts, Sven. The Gutenberg Elegies: The Fate of Reading in an Electronic Age. New York: Fawcett Columbine, 1994. Print.

Carr, Nicholas. “Is Google Making Us Stupid?” The Atlantic. 1 July 2008. Web.

—. The Shallows: What the Internet is Doing to Our Brains. New York: Norton, 2010. Print.

Coiro, Julie. “Predicting Reading Comprehension on the Internet:  Contributions of Offline Reading Skills, Online Reading Skills, and Prior Knowledge.” Journal of Literacy Research 43.4 (2011):  352-392.  Sage Publications. Web.

Crain, Caleb. “Twilight of the Books.” The New Yorker. 24 Dec. 2007. Web.

Hayles, N. Katherine. “How We Read: Close, Hyper, Machine.” ADE Bulletin. 150 (2010): 62-79. Web.

Konnikova, Maria. “Being a Better Online Reader.” The New Yorker. 16 July 2014. Web.

La Farge, Paul. “The Deep Space of Digital Reading.” Nautilus. 7 Jan. 2016. Web.

Naumann, Johannes. “A Model of Online Reading Engagement: Linking Engagement, Navigation, and Performance in Digital Reading.” Computers in Human Behavior 53 (2015): 263-277. Web.

Wolf, Maryanne. Proust and the Squid: The Story and Science of the Reading Brain. New York: HarperCollins, 2007. Print.



Fight fake news waste by asking W.A.I.S.T.: “Why Am I Sharing This?”

27 Feb

[I’ve prepared the following post to go with a brief talk I’m giving on March 1 at UC Berkeley’s Academic Innovation Studio for a panel discussion on “Beyond Hype, Hysteria, and Headlines:  Strategies to Address Media Literacy Gaps in the Classroom.”]

Shortly after she became a grandmother, the writer Anne Lamott came up with a brilliant acronym as a reminder for herself. Any time Lamott felt the urge to offer unsolicited (and probably unwelcome) parenting advice to her son and daughter-in-law she would try to stop herself by thinking:

W.A.I.T.

Which stands for “Why am I talking?”

The acronym works beautifully in its two-fold way by first reminding one to pause and then to think before speaking. Though she intended it for her own use, and subsequently as a suggestion for other grandparents, I’ve found it helpful to employ as a parent of a pre-teen as well as in meetings at work and in conversations with friends and loved ones.

Inspired by Lamott, and knowing I wanted my undergraduate students at Berkeley to do a little thinking about the “fake news” that’s much discussed in the real news these days, I came up with an assignment for them centered on another acronym:

W.A.I.S.T.

Which stands for “Why Am I Sharing This?”


Guiding Students Through the Morass of Fakery and “Fakery”

As many media observers have noted, “fake news” has become a catch-all term in the past few months for information reported and shared online that is entirely or partially made up. It can constitute anything from straight-up propaganda to biased half-truths to cynical attempts to profit from people’s mindless, emotional clicking to sharp satire, and on and on until we get to meta moments like this piece of performance art, in which earnest news commentators rightfully suss out fake news purveyors who, in their turn, prove the point of how easy it is for fake news to go viral.

And that’s to say nothing about the most troubling category of fake news:  the news that the current president categorizes as “fake” not because it’s not factual or well-sourced but because he doesn’t like the story.

Into this breach stepped the students in my second-semester freshman reading, composition, and research class. As in thousands of other college classes across the country, my students always learn to practice careful critical assessment of source material. This is essential not only to their work while in college, but also to their future as engaged, productive citizens. Since my current group is studying issues around the Internet, social media, and the human-machine interface generally, it was a natural fit to ask them to spend some time discussing and then evaluating some of the news they themselves were inclined to share in online spaces.

We first discussed what they took “fake news” to mean, and looked at some samples of things that were flat-out lies, such as this meme that was widely circulated in many forms and on many social media sites prior to the election last November:


Source: Snopes

As has been widely reported, Donald Trump never said this. That didn’t stop the meme, even after it was debunked, from being shared again and again on social media by Trump’s detractors, who desperately wanted it to be true. As my students noted, it’s difficult to figure out who originated many memes, but whoever created this one did a bang-up job of capturing a Trumpian tone, which made the quote seem plausibly real.

My class also looked at this bit of fake news about President Obama supposedly banning the Pledge of Allegiance in schools that stirred up predictable outrage amongst those who didn’t like him. Those same people presumably ignored whatever internal “Really?” voices they possessed (not to mention the funky URL with the extra “co”) and instead liked, shared, or otherwise reacted to this fake post on Facebook some 2.2 million times as of mid-November:


Source: Refinery29


A key thing to note about these two samples of fake news is that each went viral because of the emotions it stirred among Trump’s and Obama’s detractors, respectively. Many people, pleased to see confirmation of their own biases and spurred on by the controlling fervor of their emotions and the satisfying immediacy of the social media interface, shared and reacted to this fakery quickly and without much thought.

Clearly it’s time we all slowed ourselves down a bit.


The Assignment

After the above discussion, I asked my students to read this excellent article by the journalist Brooke Borel that usefully frames the problem of fake news, provides historical context and helpful links (including this list of fake news sites compiled by Melissa Zimdars of Merrimack College), and points to steps that journalists, tech companies, and we, the public, can take to ensure that fake news doesn’t metastasize into a permanent condition.

My assignment, which you can see here (why-am-i-sharing-this) was a simple one.

I asked my students to pick two items from their social media feeds that they’d be likely to share with others (or which indeed they had already shared) without too much thought. Whether a joke or apparently serious news, whether an article or a meme or a video or a GIF, the two pieces they picked had to have some facts or claims that would need to be verified. Then I asked them to write up their assessments of each piece in five steps:

  1. Describe what it is.
  2. Explain why you’d be inclined to share it quickly.
  3. Identify and carefully evaluate the source:  its origins, its credibility, its biases, its truth.
  4. Consider your audience: How would they react? What benefits or problems might arise from sharing this post so quickly?
  5. And then, knowing what you know now, would you still share the post, and what have you learned by doing this exercise?

The students picked a wide range of material and revealed themselves to be astute evaluators of truth on social media (contrary to what some studies of their generation have found), readily identifying clickbait and things that would stir social media users in particular ways. However, there was also evidence that some of them needed more help in assessing not only the factuality of material but also the context in which that material was situated and how that shaped the message. This was to be expected—most of my students are college freshmen, and people of any age can make these types of mistakes. But it also pointed up a key thing to consider about the problem of fake news: in school assignments, students make these mistakes even when they are keyed to think and act more carefully and critically. In everyday discourse on social media, where speed and emotion are king, what chance does the truth have?

Another thing that stood out in the students’ responses was their acute awareness of their immediate audience on social media:  Not only (or even primarily) “Is this post true?” but “Will my followers and friends like it?” In weighing that second question, they seemed to enact their own versions of this graphic representing “How to be unannoying on Facebook”:


Source: WaitButWhy


Intuition and the Mind:  

A Few Obvious and Yet Important Takeaways for Fighting Fake News

After the students had completed the assignment, we talked about some of the things they thought we all should do to better evaluate news and “news” that is shared on social media. Working in small groups, they came up with lots of good suggestions for making classic critical moves:  evaluating authors, media outlets, and interest groups; cross-checking facts; following trails to original sources; being aware of confirmation bias; sniffing out doctored images and sketchy website design; relying on fact-checking sites; and distinguishing credible sources from non-credible ones. I was pleased to see them suggest many of the same steps recommended by the UCB Library when it issued a guide to identifying fake news a couple of weeks later.

An amusing contrast emerged when one group listed the advice to “Use your mind!” when on social media, while across the room another group advised people to “Use your intuition.” While the assignment and our debriefing of it were largely focused on doing the former, the latter was indeed an important step in the process of assessing an item’s veracity. It just shouldn’t be the only one.

So, distilling the students’ very concrete findings and that mingling of mind and intuition, I come to the following steps for asking and enacting W.A.I.S.T. (“Why Am I Sharing This?”), not just in scholarly work but, especially, in everyday social media use. Each one enables a pause to think before sharing, liking, or reacting in whatever way:

*The more emotional you are, the more you should pause.

*If it’s not a credible source you know you can trust, you should pause.

*If the piece speaks deeply to your own biases, you should pause. Twice.

*If liking/sharing/reacting is mostly about making yourself feel better, you should pause. (See Venn Diagram above.)

*If you haven’t carefully considered how your audience will react to the news, you should pause.

*Do you want the site you’re sharing or liking to make money from your sharing and liking? No? Then you should pause.

*Have you actually read or watched the thing you’re sharing? No? You know what to do.

You get the idea. Like Anne Lamott, the grandmother aspiring to restraint, we all need to take a moment to WAIT before reacting, then ask WAIST, and then, and only then, should we act. Remember that most of the time on social media we are not in read-and-react situations of imminent danger:  we are not in a war zone, on a dark street corner at night, or about to be sacked by a 300-pound lineman. Most of us are sitting on our couch at home, getting stirred up by the little machine in our hand.

Speaking of machines…


…A Coda from E.M. Forster


For several years, I’ve had students in this same class read E.M. Forster’s prescient short story, “The Machine Stops,” which was first published in 1909. (I’ve previously written about Forster’s story and its relevance for our time elsewhere on my blog.) In the story, the people of a futuristic society live entirely underground and rarely move or leave their hive-like “cells,” such that their bodies have become “lump[s] of flesh.” The most common activity is sitting alone in one’s cell before a screen, remotely connected by a central, god-like Machine to thousands of other residents, all of them incessantly listening to and giving lectures on various “ideas” that aren’t ideas at all.

At one point in the story, an influential lecturer whose area of expertise is the French Revolution gives some advice to his sedentary, isolated listeners that captures the ethos of the soon-to-perish Machine civilization. Like so much of Forster’s story, the lecturer’s absurd advice to his somnolent audience offers us an all-too-relevant warning for our own time:

“Beware of first-hand ideas!”….“Let your ideas be second-hand, and if possible tenth-hand, for then they will be far removed from that disturbing element—direct observation. …You who listen to me are in a better position to judge about the French Revolution than I am. Your descendants will be even in a better position than you, for they will learn what you think I think, and yet another intermediate will be added to the chain. And in time…there will come a generation that has got beyond facts…a generation…which will see the French Revolution not as it happened, nor as they would like it to have happened, but as it would have happened, had it taken place in the days of the Machine.”


Digital Reading: What do we “know” and where do we go #next?

7 Apr


Wondering about the differences between traditional print reading and digital reading (or e-reading), and how they might affect not only your own reading but the way you need to teach critical reading to your students?

I was, so a UC Berkeley colleague and I did a little research and thinking about the issue, and put together the following pages for UC Berkeley’s Center for Teaching and Learning.

(An earlier, brief Prezi we put together for an on-campus talk about the issue can also be found here.)

Have a look, and feel free to offer suggestions. 

“Only Connect”

6 Feb

Earlier this week, I was standing at a bus stop in busy downtown Berkeley, waiting to ride home after work. An email came in on my smart phone from one of my students who had a question regarding an essay he was writing about E.M. Forster’s speculative 1909 short story, “The Machine Stops.”

E.M. Forster, by Dora Carrington

I looked up from my phone and briefly locked eyes with a gaunt, ragged-looking man who was walking past. He had all the markers of the hardcore homeless who, sadly, are all too familiar in Berkeley:  matted hair, torn clothing, dirt covering him from head to toe. Given his appearance, I assumed he was likely drug-addicted or schizophrenic, or both.

An extended gaze exchanged with someone like this rarely goes well; I looked back down at my phone to consider my student’s email.

Suddenly, a grimy palm was thrust between me and the phone, inches from my face. A faint scent of decaying garbage.

I recoiled, and there the man was, almost shoulder to shoulder with me as I leaned against a brick wall. The wild eyes seeing me, or not. He was muttering. I waited.  

“Can I tell you something?” he finally asked in a faint voice.


He muttered again, almost as if praying. What I could hear sounded like gibberish. I waited.

Then, with a violent motion, he chopped his hand against my phone and sent it clattering to the sidewalk.

“Turn it off!” he shouted, and then continued in unintelligible fashion, only now more loudly and inches from my face.

“OK, OK,” I said as calmly as I could, then bent to pick up my phone and started walking up the street away from the bus stop. Away from him.

“You will be executed!” he offered as benediction and then stalked off.

I circled back to the bus stop, wondering how many of the people there had watched and heard this exchange. It was hard to tell:  none of them looked at me. Most of them were looking at their phones.

Kuno Comes to Berkeley

In “The Machine Stops,” Forster’s narrator tells of a futuristic world in which the people are willingly in the grip of an all-controlling Machine that was created by humankind generations before. Each person now lives alone underground in windowless rooms that are honeycombed together “like the cell[s] of a bee.” Physical touch between people is considered repellent, in-person meetings rare. People are entirely disconnected from Nature and from each other, communicating via screens, delivering empty lectures, having things brought to them by the Machine, their minds and muscles atrophying. When the main character, Vashti, is first introduced to us, it is not as a woman but as a “swaddled lump of flesh…with a face as white as a fungus.”

My students frequently make connections to movies like WALL-E and The Matrix, and it’s also an easy leap to see the way Forster imagined us all a century later Skyping and Facebooking and YouTubing and ordering packages from Amazon by drone, as in this description of Vashti’s small room:

“Then she generated the light, and the sight of her room, flooded with radiance and studded with electric buttons, revived her. There were buttons and switches everywhere – buttons to call for food, for music, for clothing. There was the hot-bath button, by pressure of which a basin of (imitation) marble rose out of the floor, filled to the brim with a warm deodorized liquid. There was the cold-bath button. There was the button that produced literature. And there were of course the buttons by which she communicated with her friends. The room, though it contained nothing, was in touch with all that she cared for in the world.”

The only one in the story who resists The Machine and its dictates is Vashti’s son, Kuno. He begs his mother to journey across the earth to see him so he can speak to her and see her face-to-face and “not through the wearisome Machine.” He thinks and asks questions. He longs to visit the forbidden surface of the Earth and to exercise his body and to look at the sky and wonder at the stars, all of which he does before The Machine violently tugs him back beneath the ground. Kuno is the only one who foresees The Machine society’s cataclysmic end.

Kuno is the voice of reason in the story, the only one to resist the absurdity and tyranny of The Machine. Kuno’s is Forster’s voice, and ours.

And for this, Kuno and his like–the rational, the physical, the emotional, the sensual, the non-mechanical people, the ones who can see the truth–are outcasts in the society of The Machine, flung to the surface of the Earth to die, and assigned the status most feared by Vashti and her “friends”:

“Homelessness.”


I boarded the bus for home and looked down at my phone, its screen streaked with oil from the man’s hand, from mine. I wondered if I had dared not to avert my eyes from him to look at my phone but instead had held his gaze and smiled or given a friendly nod whether his response would have been different. Or perhaps he would have raved at me regardless.

How many people must shun this man, minute by minute, every single day of his life? Imagine the cumulative effect of that loneliness and that rejection by one’s fellow human beings.

In his own way, whatever the biochemistry of his brain was doing to thwart his efforts, the man was looking to make contact, and I had responded in a way that was perfectly normal, perfectly acceptable in polite society (“perfectly mechanical,” Vashti would say): I had turned to my phone. I had, as the characters in the story do, “isolated” myself.

His enraged madman’s response to me was, in the end, perfectly rational. It’s a less-polite version of the same lament so many of us regularly have about others and, if we’re being honest, about ourselves, even as we can’t resist the pull of the flashing notifications, the desire to see what the web, Forster’s evolved Machine, has delivered to us.

Indeed, the man’s screaming was a crazed echo of another of Forster’s creations, Margaret Schlegel of Howards End, who wants to implore the rigid, unemotional Henry Wilcox with one of Forster’s most famous entreaties, one that rings down the decades, louder and more urgently now than ever, if we’ll stop long enough to hear it:

“Only connect!”


Monument to E.M. Forster in Stevenage, Hertfordshire.


Students going multimodal

15 Dec

In my first-year reading and composition course at UC Berkeley this semester (“Writing in Public:  Identity and the Digital You”), my students read a series of pieces that asked them to think about digital technologies and the ways they are affecting our lives.

Among the things they read and wrote about: 

*It’s Complicated:  The Social Lives of Networked Teens, by danah boyd

*The PBS Frontline documentary, “Generation Like”

*Clive Thompson’s book, Smarter Than You Think:  How Technology is Changing Our Minds for the Better

*Philip K. Dick’s famous futuristic novel, Do Androids Dream of Electric Sheep?


They wrote in the traditional modes of freshman composition–rhetorical and literary analyses, persuasive papers and summaries, reader responses and the like. Then, after they read Thompson’s descriptions of students who had experienced the beneficial effects of writing online for an audience of more than one (the teacher), I asked them at the end of the term to write a short, multimodal essay and to post it online so that anyone–possibly you–could see it.

The class issued a collective gulp.

Exhausted after a long semester but game for the challenge of writing in a way almost none of them had tried before, they wrote and shared their essays via WordPress, Tumblr, and Prezi. I’ve linked below to some of them, organized loosely into two categories. I hope you’ll take the time to explore a few of them.

Social media and its (dis?) contents

Starting with a little historical context, Inger draws comparisons between Facebook profiles and illuminated texts from the Middle Ages.

Alex argues social media might help us bridge the gaps between our intrinsic and perceived identities, and Sierra tells us about how social media provides her with a “second home” as she moves from Korea to Canada to the Philippines and then on to the U.S. 

Exposing part of social media’s less savory side, one student looks at how it pressures people to change their appearance, while another, Chantelle, explores in particular the way this affects girls’ and women’s sense of what counts as beautiful.

Perhaps Shelby’s examination of why people tend to present only their ideal selves online accounts for some of how those pressures create a vicious cycle of self-presentation.  Maybe this is part of why people behave so aggressively online, a subject that Justin explores.

Much of the pressure comes from the prominent place of the visual in online spaces.  Katie discusses the rise of digital photography, and another student asks her friends to describe why they use Instagram the way they do.

Speaking of photography, Keshlee clearly enjoys taking selfies, and she’s glad to tell you how to up your selfie game.

Who Am I Online (and Off)?

This student asks whether it’s possible for people, including himself, to be authentic on social media. Dorothee feels like one of her favorite musicians, Ben Howard, can. (Especially if one mostly ignores social media and simply listens to his music.)  

Meanwhile, Austen finds expression by flying high above our heads. Want to learn how to do so yourself? There’s a De-Cal for that.

Lily–lover of food, reading, and golf–asks whether she’s the same person online and off, as does Stephane, who may one day win Wimbledon or improve your eyesight, or both.

This student, employing the evocative metaphor of the silhouette, debates whether we are knowable online, and demonstrates why some, including herself, often choose to represent themselves with avatars.

Kevin wonders whether he’s just being a lemming by joining social media (he says, with good cheer, that the answer is pretty much yes), while Kim lays out the virtues of the most popular networks, and Jocelyne considers the ways in which people interact on Tumblr.

Vanessa makes clear that it’s all about the audience for her, even if the audience–paradoxically for social media–is sometimes just herself, while Danxin reminds us that in the global reach of the Internet era, one’s audience (and one’s online self) can sometimes shift as it crosses borders.

My thanks to my students. (That wasn’t so painful, was it?) I hope you, the public audience, enjoyed their pieces as much as I did.

Postscript:  As I posted these essays here, I sent out the following tweet with a link to this page, and soon thereafter got the following responses from one member of the broader public audience.  

If the students above had even a fraction of Clive Thompson’s 25,000+ Twitter followers take a look at their essays, that would be an audience considerably larger and more public than the one they’re used to. Nice to have one of the authors we read cement the point for us.

A Public Audience 1

A Public Audience 2

A compendium of selfie reflections (for my students)

20 Oct
Rembrandt selfie by LoopyDave


The modern-day selfie is beyond its cultural oversaturation point.  It’s ubiquitous.  It’s been folded into markers of the elite as the OED’s word of the year for 2013; it’s been turned into a mainstream network sitcom that looks like a candidate for quick cancellation this fall; and it’s been spun into a pop music confection that offers a giggle and then disappears like a mouthful of cotton candy:

This ubiquity has resulted in repeated charges that selfie takers (particularly the young) are narcissists who are so self-involved that even when they encounter celebrities they’d rather snap a selfie than have a chat and make a personal connection with the likes of, say, Kirsten Dunst.

Outside the U.S., Muslims taking selfies during the hajj earlier this month were condemned by some as acting in ways that were disrespectful and inappropriate.  Here at home there have been heartfelt warnings, such as this one from rapper Prince Ea, to stop regarding ourselves so much and to start regarding each other.

Yet we keep taking selfies, and keep scolding ourselves for taking selfies.

Megan Garber of The Atlantic argues that we’re in a “Plateau of Productivity” now:  the hype cycle of the selfie is at its end.  We’re getting sick of them even as we continue to snap them, she notes, and this is just the point at which they become interesting to study.  I think she’s right.  (And so does the university where I work.)

The selfie isn’t new; the technology with which we create them is.  As others have noted, Rembrandt painted selfies, as did many other painters, well-known and not, and as have lots of other artists up to the modern day.

(There are also artists, it should be noted, who eschew the practice of self-portraiture.)

The pre-smartphone selfie, then, has a long history, through the ages of painting and the dawn of photography on down to our era of the camera that can be turned on oneself at whim.  We’re further encouraged these days by the urge to emulate celebrities and the positive attention they get for their own selfies.

So have we always been narcissists at base, just waiting for the right technology to draw it out of us en masse?

The selfie is of a piece with our natural human impulse to declare ourselves, to make our presence known in the world (“Kilroy was here!”), to figure out and express who we are.  And, as I’ve written elsewhere (in this essay from 2007 about the use of “I” in writing), it is in knowing ourselves that we might come to better understand and regard others.

And lest we think this is a uniquely Western phenomenon, we should consider the revelation published in Nature earlier this month of the earliest known human selfies:  a set of hands stenciled onto a cave wall in Indonesia that are believed to be at least forty thousand years old.

At least some of what’s going on here in America with reactions to selfies is probably a reflection of what de Tocqueville noticed about us during his visits to the U.S. in the 1830s:  the tension that was evident between the populism that gave birth to our democracy and the elitism that we’d supposedly rejected in splitting from aristocratic England.

Alexis de Tocqueville

We still have the elitist’s desire to distinguish ourselves from the rest (“Look at me.  Aren’t I good looking?”) while also condemning anyone who thinks they’re better than the rest (“Enough with the selfies, you narcissist!”).  And in the critiques, it runs the other way as well:  the self-restraining elitist condemns the masses who take selfies, and the masses keep taking them (while also frowning upon others who do so).

Context matters, of course, including the frequency with which one is inclined to take selfies.  Kim Kardashian endlessly photographing herself is not the same as people wanting to take a selfie with the guy who caught Travis Ishikawa’s pennant-winning home run is not the same as prehistoric human beings blowing paint over their hands, leaving stenciled traces for their descendants to find 40 millennia later.

We quickly read each selfie and judge for ourselves whether we’ve got a narcissist, a braggart, an artist, or a human being making their mark, maybe having some fun.  Maybe all of those at the same time.  We have to take each selfie as it comes, including some delightful new plays on the word, such as shelfie.

So let me conclude by looking at one of mine.  This is one I took on Father’s Day this year and then shared with friends on Facebook.  I called it a “chestie.” 


What have we got here?  A series of messages that my friends might have read into the picture:

Check out this cool t-shirt my daughter gave me.

Aren’t I clever calling this photo a “chestie”?

A shout out to my fellow A’s fans.

Much love to Oakland.

How modest I am to not include my face.

Aren’t I a great and lucky dad to have received this present?  And aren’t you jealous of my good fortune?

I was here on this day, at this time, doing this.

Did I think of all these at the time I took and posted the photo?  No.  Very happy with the gift, I put on the t-shirt, soon after saw people posting Father’s Day photos–some of them selfies–and decided to join in.  I was conscious of the decision to call it a “chestie” and to celebrate the A’s (who were doing so so SO much better at the time), but that was about it.  Snapped the selfie, posted it, and forgot it.  Up until now.

What does my selfie mean?  You be the judge.

And now I will finish this compendium by burying the lede:

I wrote this post partly for myself, out of a desire to personally bookmark some of the public discussions of the selfie that have been taking place of late, and partly for you, in case you’re interested, but mostly I wrote this for my current students, who will be embarking on writing some digital, multimodal essays about online identity in the coming month.  

I offer this brief essay as an example to them, and I’m hoping to encourage them to let me share some of their work publicly here.  Given their facility with technology, I’m sure they can do much more interesting things with digital tools than I’ve managed to do in this blog post.  (Oh, look at that textual selfie he just took–so humble!)


13 Ways of Looking at a “Like”

30 May


13 Ways of Looking at a Like

1)      I like this


2)      OMG, I freaking LOVE this!


3)      I love this so much that I am totally going to share this while wishing I’d found or thought of it first, damn it!


4)      I am completely in alignment with you politically and/or culturally.  Isn’t it great/awful, this thing we agree upon?  I am so glad you expressed it thusly so that I can “like” it and do nothing else about it.


5)      Yes, I hereby acknowledge reading this thing that you wrote.


6)      This thing you are posting, it is complicated and I don’t much feel like getting into it right now.


7)      I feel bad for you.  Hugs!


8)      How about you not being so goddamned self-centered and making a comment on other people’s walls once in a while?


9)      As Gore Vidal once said, this thing you are sharing about your life’s excellence is causing me to die a little.


10)  I something other than like this, precisely, but the attenuated range of expression available to me here forces me to fucking like this.  (Do you have to swear so much?  There are children on Facebook!  The children!)


11)  Well, crap, I liked this person’s thing and that person’s thing, even if it wasn’t really liking exactly (I mean, these things are complicated), so now I have to like your stuff too or you’ll be angry with me.


12)  I’m just clicking at this point.  It makes me feel alive.


13)  This contribution to my social graph is commodifying me into a few pennies that will be added to Mark Zuckerberg’s riches, and so once again I manage to do an injustice to the memory of Lloyd Dobler:


“I don’t want to sell anything, buy anything, or process anything as a career. I don’t want to sell anything bought or processed, or buy anything sold or processed, or process anything sold, bought, or processed, or repair anything sold, bought, or processed. You know, as a career, I don’t want to do that.”



Efficiency and Wiki Humanism (Digital Artifact for #EDCMOOC)

24 Feb

The following is the final blog post related to a MOOC (Massive Open Online Course) I am currently taking on “E-Learning and Digital Cultures,” which is being run by the University of Edinburgh. 

This is my digital artifact, created as a final assignment for the course.

The Efficient Student

It’s a word I’ve been hearing with increasing frequency from my students at U.C. Berkeley over the past decade:

I want to be a more efficient reader.

I want to write more efficiently.

I need to be more efficient with my time.

I want to be efficient.

Who can blame them?  Many of them sign on for crushingly hard schedules, and load up on additional activities and jobs that will pay their way through college, set them up as attractive candidates for post-graduate careers and further education, and, maybe, leave a little time for fun.

And with the skyrocketing rise in tuition at UC, as elsewhere in the country, the need to get the degree ASAP becomes even more pressing:



Sometimes, the word comes out in odd ways:  I want this argument to be more efficient.  “You mean you want it to be more concise or pointed,” I think to myself.  But then maybe that’s not what they mean.

There are a whole host of factors in play here, both historical and contemporary, but (I’m sorry, I want this argument to be efficient) a large measure of efficiency’s ascendancy as a virtue for my students and for our society at large comes from our increasing reliance on the Internet and mobile technologies and the speed with which we’ve come to expect certain (often mechanized) processes to work, the work of a college education among them.

The Efficient Educator

Can educators be more efficient?  Doubtlessly we can, and certainly I want to be.  The more pertinent question, however, is whether we can teach our students more effectively.  Again, the answer is certainly yes.

And digital tools can help make us more efficient.

But does more efficient mean more effective?

Jump away for a moment to take this efficient digression.

Wiki education:  So much of my classroom material has been adapted, cribbed, riffed, cross-referenced, and yes, perhaps even stolen in acts of petty pedagogical larceny from my former teachers, my colleagues, and our predecessors.  How many of our lessons are lessons passed on from others that we have then Wiki-ed, whether substantially or slightly?

I could do the same for others:  leave digital artifacts of my teaching lying about for others to glom on to. Other teachers could just take them and tweak them. Students could take them and teach themselves. They could collaborate on them and figure out a better, more efficient way to work. And I could go use my time more efficiently, maybe by moving back closer to nature, or evolving in other ways.



After all, my Prezi digression above (Over there?  Under this?  Out of here?) was built from a template someone else created, and then I (sort of) Wiki-ed clumsily onto the template with my own text and other text and images from others. I’ve extended the reach of my own capacities by using digital tools.  Is this a glimpse of what it means to be transhuman?

…Build it, leave it, let someone Wiki it, and I’m gone…


Wiki Wiki…


It starts to sound like a hip hop DJ working the record on the turntable back and forth…




I’m being silly, but I’m also kind of serious.  (And, given the architecture of sampling that scaffolds so much of hip-hop DJ culture, the comparison is apt.)  I wonder what I might upload on-line, what I could off-load into the laps of my students and give them responsibility for doing according to their own paces and aptitudes, and then whether I might be able to offer my “social presence” to them more productively, more…efficiently.

Just so long as they’re not more likely to drop out if I do this.

Though they shouldn’t drop out if I do it right. Right?

The Efficient Human Being

The efficient human being, approaching perfect efficiency, might be post-human or transhuman, or it might be something else, a distinctly Wiki-ized entity, something approaching the Singularity–a Sci Fi idea that in some circles has dropped the “Fi”–when a collective, technologically rooted superior intelligence will supposedly evolve, where the “I” becomes a “We,” and the “We” becomes a Wiki.  A very very smart Wiki.

Manufacturing has long been replacing people with machines. Developers are at work on machines that one day may be able to comment more efficiently on student writing than I could. Content farms use search algorithms to efficiently determine what stories they want their writers to generate in return for a few electronic pennies. Thousands of Chinese laborers and their ilk have been treated like overworked, highly efficient machines (willingly or not) to create the efficient device on which I write this somewhat efficient post and the efficient device on which you read it.

  “Well, John Henry was a little baby…hummmmmm!! 


The John Henry

There are plenty of things of vital importance that are inefficient. Whether one is working in the arts, the sciences, or in business, the creative process is inefficient.  The scientific method is exacting though sometimes inefficient. So is the nature of most true human relations, and so is figuring out who we are.

And love? There are many descriptors that come to mind, but efficiency isn’t one of them.

I didn’t mean for this digital artifact to sound anxious or dystopian.  I’m actually more hopeful about the prospects for technology, and for the interfaces between human beings and technology, in higher education and elsewhere, than I seem to be expressing here.  But it’s difficult to control one’s emotions—they’re terribly inefficient.  Always slowing us down, the damn things.

Becoming a more efficient human being?  Yes, that would be nice.

Becoming a human being for whom efficiency is a cardinal virtue?  Not so sure about that.

Perhaps if we just keep it simple.

Two turntables and a microphone.

Two smart phones and a WiFi connection.


Take it, Herbie Hancock:


The Quiet Place to end Week 2 of the Scottish #edcmooc

8 Feb

One of my students shared this yesterday with his classmates, and also with his “classmates.” (I have two sections of the same class who share an online discussion board on Piazza).

My students uniformly loved it.

There are some interesting things to comment on here, but I’ll reserve those and let you have your own experience of it.  All you need is to be in front of your computer with your finger over your space bar and the speakers on.  It takes about 90 seconds of your time.

Click here to enter The Quiet Place Project.  (A note on using it: If the Quiet Place fills your screen on your browser, pressing the F11 function key (on a PC) or simultaneously pressing the command+shift+f keys (on a Mac) backs you out of it.)

Night night.


Will the Revolution Be Monetized? — Week 2 of the Scottish #EDCMOOC

7 Feb

How we feel about technology is, in part, wrapped up in questions of financing.  Or, in the modern parlance, monetizing.

Will I get paid?

What will it cost me?

In this video, we see Microsoft present a dialogue-free vision of a Utopian future in which technology increasingly connects us, giving us more useful information and generally making our lives and work more efficient and convenient:

Then in this short film, we see a different, dystopian vision of a future technology (and company) called “Sight”:

In the first video, we have a company with a lot to gain from technology’s proliferation. Microsoft has monetized its products extensively, and looks to do more of the same; it is well positioned to do so.

In the second video, we have a corporation invading privacy and individual authenticity in more disturbing ways.  (It hits at least one of the common themes of science fiction identified by Annalee Newitz:  “the privacy apocalypse.”)  Naturally, the money-making company and its representative, the creepy man, are the baddies here.  The individual, the woman who thinks she’s out on a regular (if highly technologized) blind date, is the victim.  She doesn’t stand to make money from the transaction.  She stands to gain, or lose, a potential boyfriend. And maybe her dignity.

A Formula for Measuring Utopian Levels

Perhaps there is a partial formula here that could express our levels of optimism or pessimism about the advance of the digital revolution.  I’ll call it the Technological Utopianism Rate.

TU = $/ax

Here, TU (your Technological Utopianism Rate, which expresses your level of positive, hopeful feelings about technology)…

…is a function of $ (the amount of money you stand to make from technology or digital environments)…

…divided by ax (on a scale of 1-100, your relative anxiety level about the personal or societal destabilizing effects of technology).

So let’s say you were a Microsoft executive making $200,000, and you had an anxiety level about technology of 10.  Your TU would be 20,000 (with upward adjustments according to what opportunities for venture capital investment or start-up company potential or salary increases you anticipate.)

In contrast, let’s say you were, oh, a fusty old teacher whose salary was paid by a local school district, with no particular prospects or inclinations to earn money in a digital space (for your teaching or other services) except for your dividends from your retirement account’s investments in Microsoft.  Let’s say that the latter amounts to $1,000, and also that you had a technological anxiety level of 71.  Your TU would be about 14.  This could go lower, even to zero, should your job be made redundant by advances in technology.
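For the playfully inclined, the two examples above can be checked with a quick sketch. (The function name and the input check are my own additions for illustration, not anything implied by the formula itself.)

```python
def tu_rate(expected_dollars, anxiety):
    """Technological Utopianism Rate: TU = $ / ax.

    anxiety is on a 1-100 scale; expected_dollars of zero (say, a job
    made redundant by technology) drives TU to zero no matter how calm
    or panicked you are.
    """
    if not 1 <= anxiety <= 100:
        raise ValueError("anxiety must be on a 1-100 scale")
    return expected_dollars / anxiety

# The Microsoft executive: $200,000 at anxiety level 10
print(tu_rate(200_000, 10))        # 20000.0

# The fusty old teacher: $1,000 in dividends at anxiety level 71
print(round(tu_rate(1_000, 71)))   # 14
```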

The examples and the formula oversimplify the case, but it’s certainly reasonable to say that the more economic stability or gain you stand to derive from technology, the more enthusiastic you’ll be about it.

The Revolution Comes to Higher Ed

Now, picking up with that teacher and his ilk, let’s map this equation onto higher education. As Clay Shirky and Nicholas Carr note in their contrasting discussions of MOOCs–Shirky embracing them, Carr advising skepticism–universities and colleges (including UC Berkeley, where I work) are anxious about what online learning means for them.  Will higher education be the next industry to be radically changed?  Depending on which individual at which university you’re talking to, the TU level is lower or higher, but in effect, the aggregated TU is currently at or near zero for most institutions, because the numerator in the equation is zero.  The digital revolution in higher education has not been monetized.  At least not for the schools or their teachers, by and large.

As Shirky and others argue, this may be a good thing.  Traditional higher education is expensive, they note, and is getting more so; it benefits a limited audience; and the effectiveness of its methods of educating undergraduates is suspect.  Universities are overdue for some major changes.  Surely, the logic goes, there is a better,  more efficient, more cost-effective, more widely accessible way of providing education by leveraging technology.   Well-designed MOOCs, offered for free or for minimal cost, accessible to anyone with an Internet connection, might be one such answer.

Three questions to tease out here.  Two of them (and they’re important ones that both Shirky and Carr engage with) I’ll put to the side for now:  first, what changes does higher education need to undergo to revitalize itself and better serve its mission to educate students?  Second, might MOOCs offer an education as good or better than what a typical student might get in a series of university classrooms?

Let’s assume for the moment that MOOCs potentially could offer many of the same benefits, and ask the obvious money question:  who is going to pay for them?

Are the esteemed lecturers from the University of Edinburgh who are facilitating the MOOC I’m currently taking on e-learning and digital cultures being paid by someone other than their home institution?  Clearly their research interests dovetail with the content of the course, so they are getting some benefit from designing the course and helping guide us 41,000 “students” through it, even if they’re doing it pro bono.  But if they were to offer this or other MOOCs in the future, would they be paid?  Would the U of Edinburgh pay them to subsidize the education of us 41,000 free riders?  Or would the instructors primarily rely on what Jerry Brown, the Governor of California, has called “psychic income” for sustenance?

It’s not sustainable.  If good teachers aren’t paid a living wage, if we students don’t pay something, or if the public doesn’t subsidize that free/affordable education, then MOOCs, whatever their virtues, will wither or else be run by hacks looking to make a buck.  It’s the hacks who worry me.

In his brilliant critique of Shirky’s commentary on MOOCs, Aaron Bady notes the ongoing disinvestment in public education here in the U.S.  He also writes:

“Since there is a lot of unmet desire for education out there, and since that desire is glad to have the thing it wants when it finds it for free, it seems all to the good that students can find courses for free.  But while we should ask questions about why venture capitalists are investing so heavily in educational philanthropy, we also need to think more carefully about why there is so much unmet desire in the first place, and why so many people want education without, apparently, being able to pay for it.  Why hasn’t that desire already found a way to become demand, such that it must wait until Silicon Valley venture capitalists show up, benevolently bearing the future in their arms?”

The venture capitalists are there, as Bady notes, because with the gaps in public investment in higher education they see an opportunity to “speculate.”  They see the potential for the revolution in online education to be monetized.  And they want to be in a position to collect if the cash starts to flow.

In Which the Bank Teaches Us a Lesson

There once was a young man, just out of college, looking for a job during a recession.  He searched and searched, and one day, the public relations department of a major bank pulled his resume out of a pile and offered him a job as a writer.  The young man wasn’t too keen on working at a bank, but a job was a job in those recessionary times, and so he took it.  The pay was quite good.

They spoke a different language, these bankers.  They spoke of “leverage” and “cash flow” and returns on equity and investment.  The young man listened, and learned the language.  He wrote their press releases and brochures, he wrote speeches and jokes for the CEO to tell at luncheons.

In public statements and interviews, the CEO was constantly noting his laser-like focus on serving the bank’s shareholders.  Not the customers or the employees–of course the bank was serving those.  But his primary focus was on increasing the value of the shareholders’ investment.  At the time, the naive young man found this a little weird.  Sure banks want to generate profits, he thought, and shareholders are technically the owners, but isn’t the idea to balance a desire for a profitable business with good service to customers and good jobs for hardworking employees?  Well, yes, but only if it serves the bottom line:  driving up that stock price, increasing the market capitalization of the company, for those shareholders.

The bank was about increasing profits at all costs.  It demanded logic of the following sort, which the young man was once asked to explain to any journalist who cared to call:  “Yes, the bank is raising fees on checking accounts, but only so we can better serve our customers.”

“What the hell do you mean?” the young man imagined reporters asking.  And he supposed there was a certain kind of logic to it:  if the bank collected more fees from its customers, it would have more funds with which it could potentially hire more employees to work in the call centers, who could take more phone calls from those customers who would otherwise have been kept on hold waiting for an explanation of why their fees were going up.  See?  Doesn’t the bank have its customers’ best interests at heart?

You could almost believe you weren’t being spun if the dizziness stopped for long enough.

The young man, of course, was me, and I share this recollection because I find it instructive.  A bank is not a university–its purposes and functions are different.  But in looking at a bank–a kind of meta-monetizer with a relentless focus on profit–we see something to be cautious of in education.

A school doesn’t need to focus on profit in the same way that bank did (and still does), but it does need to operate in a way that is economically sustainable for teachers, students, and the larger community.  MOOCs as they’re currently structured are not sustainable, no matter how engaging (as this MOOC has been) or how poorly run they are, which is why some in academia take a look at them and see a future dystopia in higher ed.

If we can find a way to make them sustainable, and not merely monetizable in the Silicon Valley sense of the word, then maybe, just maybe, we might end up with a useful supplement to university instruction that could help foster a higher higher education that all of us presumably want.

In the meantime, I’ve got a high-fee checking account that might be right up your alley.