MISC Monday: MLK media literacy; social media stress; the attention economy; and more

Examine the life and legacy of Dr. Martin Luther King Jr. and the Civil Rights Movement with hundreds of PBS LearningMedia resources. Here is a sampling from that extensive collection; use these resources to explore media literacy, from historical documentaries to media coverage of social movements.

Among the survey’s major findings is that women are much more likely than men to feel stressed after becoming aware of stressful events in the lives of others in their networks.

“Stress is kind of contagious in that way,” said Keith Hampton, an associate professor at Rutgers University and the chief author of the report. “There’s a circle of sharing and caring and stress.”

In a survey of 1,801 adults, Pew found that frequent engagement with digital services wasn’t directly correlated with increased stress. Women who used social media heavily even recorded lower stress. The survey relied on the Perceived Stress Scale, a widely used stress-measurement tool developed in the early 1980s.

“We began to work fully expecting that the conventional wisdom was right, that these technologies add to stress,” said Lee Rainie, the director of Internet, science, and technology research at Pew. “So it was a real shock when [we] first looked at the data and … there was no association between technology use, especially heavy technology use, and stress.”

The higher incidence of stress among the subset of technology users who are aware of stressful events in the lives of others is something that Hampton and his colleagues call “the cost of caring.”

“You can use these technologies and, as a woman, it’s probably going to be beneficial for your level of stress. But every now and then, bad things are going to happen to people you know, and there’s going to be a cost for that,” Hampton said.

The real danger we face from computer automation is dependency. Our inclination to assume that computers provide a sufficient substitute for our own intelligence has made us all too eager to hand important work over to software and accept a subservient role for ourselves. In designing automated systems, engineers and programmers also tend to put the interests of technology ahead of the interests of people. They transfer as much work as possible to the software, leaving us humans with passive and routine tasks, such as entering data and monitoring readouts. Recent studies of the effects of automation on work reveal how easily even very skilled people can develop a deadening reliance on computers. Trusting the software to handle any challenges that may arise, the workers fall victim to a phenomenon called “automation complacency”.

Should we be scared of the future?
I think we should be worried about the future. We are putting ourselves passively into the hands of those who design the systems. We need to think critically about that, even as we maintain our enthusiasm for the great inventions that are happening. I’m not a Luddite. I’m not saying we should trash our laptops and run off to the woods.

We’re basically living out Freud’s death drive, trying our best to turn ourselves into inorganic lumps.
Even before Freud, Marx made the point that the underlying desire of technology seemed to be to create animate technology and inanimate humans. If you look at the original radios, they were transmission as well as reception devices, but before long most people just stopped transmitting and started listening.

From an educational perspective, what we must understand is the relationship between information and meaning. Meaning is not an inevitable outcome of access to information but rather emerges slowly, once one has cultivated the ability to incorporate that information in purposeful and ethical ways. Very often this process requires a slowdown rather than a speedup, the latter of which is a primary bias of many digital technologies. The most powerful educational experiences stem from the relationships formed between teacher and student, peer and peer. A smart classroom isn’t necessarily one that includes the latest technologies, but one that facilitates greater interaction among teachers and students, and greater responsibility for the environment within which one learns. A smart classroom is thus spatially, not primarily technologically, smart. While the two are certainly not mutually exclusive (and much has been written on both), we do ourselves a disservice when privileging the latter over the former.

  • Dowd’s argument here is similar to Carr’s thoughts on MOOCs:

In education, computers are also falling short of expectations. Just a couple of years ago, everyone thought that massive open online courses – Moocs – would revolutionise universities. Classrooms and teachers seemed horribly outdated when compared to the precision and efficiency of computerised lessons. And yet Moocs have largely been a flop. We seem to have underestimated the intangible benefits of bringing students together with a real teacher in a real place. Inspiration and learning don’t flow so well through fibre-optic cables.

  • MediaPost editor Steve Smith writes about his relationship with his iPhone, calling it life’s new remote:

The idea that the cell phone is an extension of the self is about as old as the device itself. We all recall the hackneyed “pass your phone to the person next to you” thought experiment at trade shows four or five years ago. It was designed to make the point of how “personally” we take these devices.

And now the extraordinary and unprecedented intimacy of these media devices is a part of legal precedent. The recent Supreme Court ruling limiting searches of cell phone contents grounded the unanimous opinion on an extraordinary observation. Chief Justice John Roberts described these devices as being “such a pervasive and insistent part of daily life that the proverbial visitor from Mars might conclude they were an important feature of human anatomy.”

We are only beginning to understand the extent to which these devices are blending the functionality of media with that of real-world tools. And it is in line with one of the core observations Marshall McLuhan made in “Understanding Media” decades ago.

As early as 1971, Herbert Simon observed that “what information consumes is rather obvious: it consumes the attention of its recipients. Hence a wealth of information creates a poverty of attention, and a need to allocate that attention efficiently among the overabundance of information sources that might consume it”. Thus, instead of reaping the benefits of the digital revolution, we are intellectually deprived by our inability to filter out sensory junk in order to translate information into knowledge. As a result, we are collectively wiser, in that we can retrieve all the wisdom of the world in a few minutes, but individually more ignorant, because we lack the time, self-control, or curiosity to do it.

There are also psychological consequences of the distraction economy. Although it is too soon to observe any significant effects from technology on our brains, it is plausible to imagine that long-term effects will occur. As Nicholas Carr noted in The Shallows: What the Internet Is Doing to Our Brains, repeated exposure to online media demands a cognitive change from deeper intellectual processing, such as focused and critical thinking, to fast autopilot processes, such as skimming and scanning, shifting neural activity from the hippocampus (the area of the brain involved in deep thinking) to the prefrontal cortex (the part of the brain engaged in rapid, subconscious transactions). In other words, we are trading accuracy for speed and prioritising impulsive decision-making over deliberate judgment. In the words of Carr: “The internet is an interruption system. It seizes our attention only to scramble it”.

The research carried out by Harvard Medical School and published in the journal Proceedings of the National Academy of Sciences studied the sleeping patterns of 12 volunteers over a two-week period. Each individual read a book before their strict 10PM bedtime — spending five days with an iPad and five days with a paper book. The scientists found that when reading on a lit screen, volunteers took an average of 10 minutes longer to fall asleep and received 10 minutes less REM sleep. Regular blood samples showed they also had lower levels of the sleep hormone melatonin, consistent with a circadian cycle delayed by one and a half hours.

Ever since the frequent cocaine user and hater of sleep Thomas Edison flicked on the first commercially viable electric lightbulb, a process has taken hold through which the darkness of sleep time has been systematically deconstructed and illuminated.

Most of us now live in insomniac cities with starless skies, full of twinkling neon signage and flickering gadgets that beg us to stay awake longer and longer. But for all this technological innovation, we still must submit to our diurnal rhythm if we want to stay alive.

And even though sleep may “frustrate and confound strategies to exploit and reshape it,” as Crary says, it, like anything, remains a target of exploitation and reshaping – and in some cases, all-out elimination.

What is striking about this corporate monopolization of the internet is that all the wealth and power has gone to a small number of absolutely enormous firms. As we enter 2015, 13 of the 33 most valuable corporations in the United States are internet firms, and nearly all of them enjoy monopolistic market power as economists have traditionally used the term. If you continue to scan down the list, precious few additional internet firms are to be found. There is not much of a middle class or even an upper-middle class of internet corporations.

This poses a fundamental problem for democracy, though it is one that mainstream commentators and scholars appear reluctant to acknowledge: If economic power is concentrated in a few powerful hands you have the political economy for feudalism, or authoritarianism, not democracy. Concentrated economic power invariably overwhelms the political equality democracy requires, leading to routinized corruption and an end of the rule of law. That is where we are today in the United States.

The short answer is technology. Yes, Facebook really did ruin everything. The explosion in communication technologies over the past decades has re-oriented society and put more psychological strain on us all to find our identities and meaning. For some people, the way to ease this strain is to actually reject complexity and ambiguity for absolutist beliefs and traditional ideals.

Philosopher Charles Taylor wrote that it would be just as difficult to not believe in God in 1500 as it is to believe in God in the year 2000. Obviously, most of humanity believes in God today, but it’s certainly become a much more complicated endeavor. With the emergence of modern science, evolution, liberal democracy, and worldwide 24-hour news coverage of corruption, atrocities, war and religious hypocrisy, today a person of faith has their beliefs challenged more in a week than a person a few generations ago would have in half a lifetime.

Your brain on Kindle; 21st Century media literacy; how Disney shapes youth identity


Neuroscience, in fact, has revealed that humans use different parts of the brain when reading from a piece of paper or from a screen. So the more you read on screens, the more your mind shifts towards “non-linear” reading — a practice that involves things like skimming a screen or having your eyes dart around a web page.

Using the technology approach, the iPhone is the “school” and anyone who uses it adeptly is the master, while anyone over 30 is, well, handicapped at best. New technologies enable this approach because hardware and software are now widely available and production has been democratized — everyone is a producer, a collaborator, a distributor and a participant. While experiential and project-based learning is truly exciting and an important component of media literacy, it is not synonymous with media literacy, because the outcome of the technology approach is often limited to technical proficiency without critical autonomy. Whether using an iPad, a pencil or a videocam, pressing the right buttons is important but not enough!

The information, entertainment and cultural pedagogy disseminated by massive multimedia corporations have become central in shaping and influencing every waking moment of children’s daily lives – all toward a lifetime of constant, unthinking consumption. Consumer culture in the United States, and increasingly across the globe, does more than undermine the ideals of a secure and happy childhood: it exhibits the bad faith of a society in which, for children, “there can be only one kind of value, market value; one kind of success, profit; one kind of existence, commodities; and one kind of social relationship, markets.”

A ticklish subject: Decrying, defending Žižek as teacher


  • Slavoj Žižek’s pedagogy became a topic of debate among critics and supporters of the philosopher after video of an interview with Žižek was posted to YouTube. In the 10-minute video, recorded in April at the 2014 Žižek Conference in Cincinnati, Žižek discusses his loathing of office hours, among other subjects. Regarding classes he has taught in the U.S., Žižek recalls telling students “If you don’t give me any of your shitty papers, you get an A”. Here is the video on YouTube:

I even told students at the New School for example… if you don’t give me any of your shitty papers, you get an A. If you give me a paper I may read it and not like it and you can get a lower grade.

  • And regarding office hours:

I can’t imagine a worse experience than some idiot comes there and starts to ask you questions, which is still tolerable. The problem is that here in the United States students tend to be so open that sooner or later, if you’re kind to them, they even start to ask you personal questions [about] private problems… What should I tell them?

Zizek has always been vocal about his general disdain for students and humanity writ large. He once admitted in 2008 that seeing stupid people happy makes him depressed, before describing teaching as the worst job he has ever had.

[…]

On a personal note, I was once told at the New School by a senior faculty member that Zizek would fill up his sign-up sheet for office hours with fake names to avoid student contact. I still wonder if that story is true, but now it doesn’t seem so out of character.

I have no idea what a superstar like Žižek gets paid, and I don’t know if he actually fills his office-hours sign-up sheet with fake names so that none of the “boring idiots” come and bother him with their stupid problems, as one New School faculty member has apparently claimed. But I feel safe in guessing that he earns more to not-grade one “shitty paper” than many professors do in a semester.

The real problem with Žižek, in any case, isn’t that he feels this way or that he says these things aloud. It’s that he does so and people think it’s hilarious. It’s that his view is, believe it or not, a common “superstar” view of students—so common, in fact, that if you work at a research university and actually like teaching, you should maybe pretend you don’t, lest you appear not “serious” enough about your research.

[…]

The academy is in crisis. The humanities’ relevance is questioned obnoxiously on a near-daily basis. Humanists need to think carefully about who our heroes are, and who should represent our disciplines to the public. Maybe, just maybe, this Ži-jerk has finally proved himself unsuited to the task.

I’m sure all of us have stories of colleagues basically slandering their students, and there is no more common complaint in the academic world than about the tedium of grading. I would venture to say that much of the resentment of Zizek’s attitudes stems from an unacknowledged desire to do exactly the things they’re castigating Zizek for. Wouldn’t it be awesome to be able to tell the students what I really think of them? Wouldn’t it be great not to have to deal with their crappy writing? Wouldn’t it be amazing to finally take the university at its word, valuing research absolutely and exclusively while making at best a token gesture toward teaching?

Indeed, it was disdain for teaching that made it so tempting to outsource pedagogical labor to grad students and underpaid adjuncts so that real professors could have the space to do real academic work. Zizek’s opinions aren’t some crazy outlier, they’re the structuring principles of our system of academic labor.

That I have met Zizek personally and can attest that he is no jerk is a minor point. That I personally witnessed him reject dinner with established professors and instead choose to sit with undergraduate students at a University of Rochester event is also fairly trivial but instructive about his actual attitude towards students.

[…]

No one has been more outspoken, or effective, about combating this “crisis” [in academia] than Zizek. He is dogmatic in his steadfast criticism of the Bologna reforms in Europe. Rejecting globalization’s call for experts instead of critically-thinking humanists cannot be accomplished through office hours and friendly teaching styles.

The risk of losing the liberal arts is inextricably linked to the intrusion of unfettered (ostensibly) “market” mechanisms throughout human life. Where there used to be at least some sanctuary, now there is none. Education is just one of the last to fall.

Mrs. Schuman also uses the basic strategy usually employed by politically right-wing authors who try to dismiss Žižek’s political engagements: taking one of his jokes out of context. I have followed the comments of various professors online about this particular statement of Žižek’s about not reading and grading student papers, and most of them shared some sympathy with the joke, agreeing that such work consumes a huge amount of their time and brings them little gratification. Mrs. Schuman fails to notice that Žižek is employed as a senior researcher, not as a regular professor, at least at his faculty in Ljubljana; to attack him for not grading papers there is simply absurd, since it is a formal post and he rarely appears there to deliver a lecture at all. She also fails to notice, like most of the people bashing him, that the statement was a joke, one of many such phrases produced in recorded interviews mainly to provoke a media response; in a way, Mrs. Schuman’s text could be said to have been expected before it was ever written.

But anyone who is familiar with how he develops theory should notice that he is also the International Director of the Institute for the Humanities at Birkbeck in London, where he annually holds very serious ‘masterclasses’ consisting of multiple successive days of lectures followed by discussions with his students. There is more than enough opportunity there for those who are genuinely interested in his work to offer comments and criticism and, more importantly, to get a first-person perspective and a chance to collaborate on the development of his theory; his lectures there have often ended up as important parts of his big philosophical tomes later on. So he does teach classes, very important classes with philosophical consequences, and Mrs. Schuman repeats the accusation against him simply because she does not seem to be very familiar with his work. Those who are obsessed with Žižek’s place in the employment scheme of academia usually harbour resentment and envy born of their own lack of luck at getting a satisfying job in the academic machinery, and are just searching for quick attempts at dismissal.

Critical Pedagogy and Imperialism; social media and commodity fetishism

Gramsci has had a huge impact on critical pedagogy, especially because of the importance he attached to the role of culture, in both its highbrow and popular forms, in the process of hegemony, which combines rule by force with rule by consent. His discussion of the role of intellectuals in this process also influenced discussions centering on educators as cultural workers in the critical pedagogy field. Henry Giroux has been particularly influential here. One issue which deserves greater treatment in critical pedagogy, in my view, is that of ‘powerful knowledge’ – knowledge which, though not necessarily popular knowledge and itself in need of problematising, should still be mastered if one is not to remain at the margins of political life.

[…]

Following Freire, I would say: the commitment to teaching is a political commitment because education is a political act. There is no such thing as a neutral education. We must always ask whose side we are on when we teach. More importantly, we should ask: with whom are we educating and learning? I ask this question in the spirit of Freire’s emphasis on working with rather than for the oppressed.

In tying Marxist ideology to social media, there are a number of things to clarify, as the comparison is not a perfect one. Perhaps the most questionable caveat is the ownership of the modes of production. In the social media model, it can be said that the proletariat themselves own the modes of production, since they typically own the computers or devices through which they channel their intellectual labor. Additionally, almost all popular social media networks today allow users to retain the copyright of the content that they post (Facebook, n.d.; MySpace, n.d.; Twitter, n.d.). Thus, it would seem that arguing that users are alienated from the results of their intellectual labor power is a moot point.

[…]

I humbly suggest that in the social media model, owning the output or product of intellectual labor power has little if anything to do with Marx’s species being. Instead, I feel that it is the social connections created, broken, strengthened, or weakened that feed directly into the worker’s species being. Since the output of the intellectual labor power in this case is not a tangible good, the only “finished product” that the worker can place value in and not be alienated from is the actual social connection that their output generates, not the output itself. This allows for a supra or meta level of social connection above that of the social connections embodied in the physical outputs outlined by Marx.

Graeber on labor and leisure; the perils of hipster economics; and the educational value of MOOCs

Right after my original bullshit jobs piece came out, I used to think that if I wanted, I could start a whole career in job counseling – because so many people were writing to me saying “I realize my job is pointless, but how can I support a family doing something that’s actually worthwhile?” A lot of people who worked the information desk at Zuccotti Park, and other occupations, told me the same thing: young Wall Street types would come up to them and say “I mean, I know you’re right, we’re not doing the world any good doing what we’re doing. But I don’t know how to live on less than a six figure income. I’d have to learn everything over. Could you teach me?”

But I don’t think we can solve the problem by mass individual defection. Or some kind of spiritual awakening. That’s what a lot of people tried in the ‘60s and the result was a savage counter-offensive which made the situation even worse. I think we need to attack the core of the problem, which is that we have an economic system that, by its very nature, will always reward people who make other people’s lives worse and punish those who make them better. I’m thinking of a labor movement, but one very different than the kind we’ve already seen. A labor movement that manages to finally ditch all traces of the ideology that says that work is a value in itself, but rather redefines labor as caring for other people.

Proponents of gentrification will vouch for its benevolence by noting it “cleaned up the neighbourhood”. This is often code for a literal white-washing. The problems that existed in the neighbourhood – poverty, lack of opportunity, struggling populations denied city services – did not go away. They were simply priced out to a new location.

That new location is often an impoverished suburb, which lacks the glamour to make it the object of future renewal efforts. There is no history to attract preservationists because there is nothing in poor suburbs viewed as worth preserving, including the futures of the people forced to live in them. This is blight without beauty, ruin without romance: payday loan stores, dollar stores, unassuming homes and unpaid bills. In the suburbs, poverty looks banal and is overlooked.

In cities, gentrifiers have the political clout – and accompanying racial privilege – to reallocate resources and repair infrastructure. The neighbourhood is “cleaned up” through the removal of its residents. Gentrifiers can then bask in “urban life” – the storied history, the selective nostalgia, the carefully sprinkled grit – while avoiding responsibility to those they displaced.

Hipsters want rubble with guarantee of renewal. They want to move into a memory they have already made.

In the pedagogic trenches, MOOCs are considered a symptom of wider economic patterns which effectively vacuum resources up into the financial stratosphere, leaving those doing the actual work with many more responsibilities, and far less compensation. Basic questions about the sustainability of this model remain unanswered, but it is clear that there is little room for enfranchised, full-time, fully-compensated faculty. Instead, we find an army of adjuncts servicing thousands of students; a situation which brings to mind scenes from Metropolis rather than Dead Poets Society.

[…]

For companies pushing MOOCs, education is no different from entertainment: it is simply a question of delivering ‘content.’ But learning to think exclusively via modem is like learning to dance by watching YouTube videos. You may get a sense of it, but no-one is there to point out mistakes, deepen your understanding, contextualise the gestures, shake up your default perspective, and facilitate the process. The role of the professor or instructor is not simply to shepherd the transmission of information from point A to point B, but to co-forge new types of knowledge and to test these critically for various versions of soundness and feasibility. Wisdom may be eternal, but knowledge – both practical and theoretical – evolves over time, and especially exponentially in the last century, with all its accelerated technologies. Knowledge is always mediated, so we must consciously take the tools of mediation into account. Hence the need for a sensitive and responsive guide: someone students can bounce new notions off, rather than simply absorb information from. Without this element, distance learning all too often becomes distanced learning, just as a class taken remotely usually leads to a sea of remote students.

[…]

Marshall McLuhan was half-right when he insisted that the electronic age is ushering in a post-literate society. But no matter how we like to talk of new audio-visual forms of literacy, there is still the ‘typographic man’ pulling the strings, encouraging us to express ourselves alphabetically. Indeed, the electronic and the literate are not mutually exclusive, much as people like to pit them against each other.

  • Pettman also quotes Ian Bogost’s comments on distance learning:

The more we buy into the efficiency argument, the more we cede ground to the technolibertarians who believe that a fusion of business and technology will solve all ills. But then again, I think that’s what the proponents of MOOCs want anyway. The issue isn’t online education per se, it’s the logics and rationales that come along with certain implementations of it.

Manifesto for a Ludic Century, ludonarrative dissonance in GTA, games and mindf*cks, and more

Systems, play, design: these are not just aspects of the Ludic Century, they are also elements of gaming literacy. Literacy is about creating and understanding meaning, which allows people to write (create) and read (understand).

New literacies, such as visual and technological literacy, have also been identified in recent decades. However, to be truly literate in the Ludic Century also requires gaming literacy. The rise of games in our culture is both cause and effect of gaming literacy in the Ludic Century.

So, perhaps there is one fundamental challenge for the Manifesto for a Ludic Century: would a truly ludic century be a century of manifestos? Of declaring simple principles rather than embracing systems? Or, is the Ludic Manifesto meant to be the last manifesto, the manifesto to end manifestos, replacing simple answers with the complexity of “information at play?”

Might we conclude that videogames are the first creative medium to fully emerge after Marshall McLuhan? By the time they became popular, media ecology as a method was well-known. McLuhan was a popular icon. By the time the first generation of videogame players was becoming adults, McLuhan had become a trope. When the then-new publication Wired Magazine named him their “patron saint” in 1993, the editors didn’t even bother to explain what that meant. They didn’t need to.

By the time videogame studies became a going concern, McLuhan was gospel. So much so that we don’t even talk about him. To use McLuhan’s own language of the tetrad, game studies have enhanced or accelerated media ecology itself, to the point that the idea of studying the medium itself over its content has become a natural order.

Generally speaking, educators have warmed to the idea of the flipped classroom far more than that of the MOOC. That move might be injudicious, as the two are intimately connected. It’s no accident that private, for-profit MOOC startups like Coursera have advocated for flipped classrooms, since those organizations have much to gain from their endorsement by universities. MOOCs rely on the short, video lecture as the backbone of a new educational beast, after all. Whether in the context of an all-online or a “hybrid” course, a flipped classroom takes the video lecture as a new standard for knowledge delivery and transfers that experience from the lecture hall to the laptop.

  • Also, with increased awareness of Animal Crossing following from the latest game’s release for the Nintendo 3DS, Bogost recently posted an excerpt from his 2007 book Persuasive Games discussing consumption and naturalism in Animal Crossing:

Animal Crossing deploys a procedural rhetoric about the repetition of mundane work as a consequence of contemporary material property ideals. When my (then) five-year-old began playing the game seriously, he quickly recognized the dilemma he faced. On the one hand, he wanted to spend the money he had earned from collecting fruit and bugs on new furniture, carpets, and shirts. On the other hand, he wanted to pay off his house so he could get a bigger one like mine.

Ludonarrative dissonance is when the story the game is telling you and your gameplay experience somehow don’t match up. As an example, this was a particular issue in Rockstar’s most recent game, Max Payne 3. Max constantly makes remarks about how terrible he is at his job, even though he does more than is humanly possible to try to protect his employers – including making perfect one-handed head shots in mid-air while drunk and high on painkillers. The disparity and the dissonance between the narrative of the story and the gameplay leave things feeling off-kilter and poorly interconnected. It doesn’t make sense or fit with your experience, so it feels wrong and damages the cohesiveness of the game world and story. It’s like when you go on an old-lady-only murdering spree as Niko, who is supposed to be a reluctant killer with a traumatic past, not a gerontophobic misogynist.

What I find strange, in light of our supposed anti-irony cultural moment, is a kind of old-fashioned ironic conceit behind a number of recent critical darlings in the commercial videogame space. 2007’s Bioshock and this year’s Bioshock: Infinite are both about the irony of expecting ‘meaningful choice’ to live in an artificial dome of technological and commercial constraints. Last year’s Spec Ops: The Line offers a grim alchemy of self-deprecation and preemptive disdain for its audience. The Grand Theft Auto series has always maintained a cool, dismissive cynicism beneath its gleefully absurd mayhem. These games frame choice as illusory and experience as artificial. They are expensive, explosive parodies of free will.

To cut straight to the heart of it, Bioshock seems to suffer from a powerful dissonance between what it is about as a game, and what it is about as a story. By throwing the narrative and ludic elements of the work into opposition, the game seems to openly mock the player for having believed in the fiction of the game at all. The leveraging of the game’s narrative structure against its ludic structure all but destroys the player’s ability to feel connected to either, forcing the player to either abandon the game in protest (which I almost did) or simply accept that the game cannot be enjoyed as both a game and a story, and to then finish it for the mere sake of finishing it.

The post itself makes a very important point: games, for the most part, can’t pull the Mindfuck like movies can because of the nature of the kind of storytelling to which most games are confined, which is predicated on a particular kind of interaction. Watching a movie may not be an entirely passive experience, but it’s clearly more passive than a game. You may identify with the characters on the screen, but you’re not meant to implicitly think of yourself as them. You’re not engaging in the kind of subtle roleplaying that most (mainstream) games encourage. You are not adopting an avatar. In a game, you are your profile, you are the character you create, and you are also to a certain degree the character that the game sets in front of you. I may be watching everything Lara Croft does from behind her, but I also control her; to the extent that she has choices, I make them. I get her from point A to B, and if she fails it’s my fault. When I talk about something that happened in the game, I don’t say that Lara did it. I say that I did.

Anachrony is a common storytelling technique in which events are narrated out of chronological order. A familiar example is a flashback, where story time jumps to the past for a bit, before returning to the present. The term “nonlinear narrative” is also sometimes used for this kind of out-of-order storytelling (somewhat less precisely).

While it’s a common technique in literature and film, anachrony is widely seen as more problematic to use in games, perhaps even to the point of being unusable. If the player’s actions during a flashback scene imply a future that differs considerably from the one already presented in a present-day scene (say, the player kills someone they had been talking to in a present-day scene, or commits suicide in a flashback), this produces an inconsistent narrative. The root of the problem is that players generally have a degree of freedom of action, so flashbacks are less like the case in literature and film – where already-decided events are simply narrated out of order – and more like time travel, where the player travels back in time and can mess up the timeline.
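
To make that time-travel framing concrete, here is a minimal sketch of one way a game might enforce flashback consistency: present-day scenes commit facts to a timeline, and flashback actions whose consequences would contradict those facts are vetoed. Everything in it – the Timeline class, the contradicts() check, the “informant” example – is a hypothetical illustration, not something taken from the quoted post.

```python
# Hypothetical sketch: guard flashback scenes against contradicting
# facts the narrative has already committed to in present-day scenes.

class Timeline:
    def __init__(self):
        # Facts fixed by already-played scenes, as (state, character) pairs.
        self.established_facts = set()

    def establish(self, fact):
        """Record a fact the narrative has already committed to."""
        self.established_facts.add(fact)

    def allows(self, consequence):
        """Permit a flashback action only if its consequence
        contradicts no established fact."""
        return not any(contradicts(consequence, fact)
                       for fact in self.established_facts)

def contradicts(a, b):
    # Toy contradiction test: ("dead", x) contradicts ("alive", x).
    return a[1] == b[1] and {a[0], b[0]} == {"alive", "dead"}

timeline = Timeline()
# A present-day scene showed the player talking to the informant,
# so the informant must still be alive at that point in story time.
timeline.establish(("alive", "informant"))

# Later, inside a flashback, the player tries to kill the informant.
if timeline.allows(("dead", "informant")):
    print("Action permitted; the timeline stays consistent.")
else:
    print("Action blocked: it would contradict a present-day scene.")
```

A real game would need a far richer model of facts and contradictions, of course; the sketch only shows the shape of the constraint that makes game flashbacks closer to time travel than to out-of-order narration.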

The first of the books is set to be published in early 2014. The writers Press Select will publish in its first round have written for publications like Edge magazine, Kotaku, Kill Screen and personal blogs; they include Chris Dahlen, Michael Abbott, Jenn Frank, Jason Killingsworth, Maddy Myers, Tim Rogers, Patricia Hernandez and Robert Yang.