- Journalist and media critic Chris Hedges recently interviewed Noam Chomsky. Hedges’ writeup of the discussion is up at Truthdig:
Chomsky believes that the propaganda used to manufacture consent, even in the age of digital media, is losing its effectiveness as our reality bears less and less resemblance to the portrayal of reality by the organs of mass media. While state propaganda can still “drive the population into terror and fear and war hysteria, as we saw before the invasion of Iraq,” it is failing to maintain an unquestioned faith in the systems of power. Chomsky credits the Occupy movement, which he describes as a tactic, with “lighting a spark” and, most important, “breaking through the atomization of society.”
“There are all sorts of efforts to separate people from one another,” he said. “The ideal social unit [in the world of state propagandists] is you and your television screen. The Occupy actions brought that down for a large part of the population. People recognized that we could get together and do things for ourselves. We can have a common kitchen. We can have a place for public discourse. We can form our ideas. We can do something. This is an important attack on the core of the means by which the public is controlled. You are not just an individual trying to maximize consumption. You find there are other concerns in life. If those attitudes and associations can be sustained and move in new directions, that will be important.”
- Video of the interview is available on TheRealNews YouTube channel. Here is the first of the three videos:
- In a recent article for The Guardian, Slavoj Žižek discusses how “WikiLeaks opened our eyes to the illusion of freedom”:
Not only have we learned a lot about the illegal activities of the US and other great powers. Not only have the WikiLeaks revelations put secret services on the defensive and set in motion legislative acts to better control them. WikiLeaks has achieved much more: millions of ordinary people have become aware of the society in which they live. Something that until now we silently tolerated as unproblematic is rendered problematic.
This is why Assange has been accused of causing so much harm. Yet there is no violence in what WikiLeaks is doing. We all know the classic scene from cartoons: the character reaches a precipice but goes on running, ignoring the fact that there is no ground underfoot; they start to fall only when they look down and notice the abyss. What WikiLeaks is doing is just reminding those in power to look down.
The reaction of all too many people, brainwashed by the media, to WikiLeaks’ revelations could best be summed up by the memorable lines of the final song from Altman’s film Nashville: “You may say I ain’t free but it don’t worry me.” WikiLeaks does make us worry. And, unfortunately, many people don’t like that.
- Ian Bogost recently devoted an entire piece for The Atlantic to a discussion of Darmok, one of my personal favorite episodes of Star Trek: The Next Generation. After a thorough rundown of the episode, Bogost discusses the concepts of metaphor and allegory as depicted in the language of the Tamarian race, and applies the principles to his notion of procedural rhetoric:
“Darmok” gives us one vision of a future in which procedural rhetoric takes precedence over verbal and visual rhetoric, indeed in which the logic of logics subsume the logics of description, appearances, and even of narrative—that preeminent form that even Troi mistakes as paramount to the Children of Tama. The Tamarian’s media ecosystem is the opposite of ours, one in which behaviors are taken as primary, and descriptions as secondary, almost incidental. The Children of Tama are less interesting as aliens than they are as counterfactual versions of us, if we preferred logic over image or description.
At the end of “Darmok,” Riker finds Captain Picard sitting in his ready room, reading from an ancient book rather than off a tablet. “Greek, sir?” Riker asks. “The Homeric Hymns,” Picard responds, “One of the root metaphors of our own culture.” “For the next time we encounter the Tamarians…” suggests the first officer. To which his captain replies, “More familiarity with our own mythology might help us relate to theirs.” A charming sentiment, and a move that always works for Star Trek—the juxtaposition of classical antiquity and science-fictional futurism. But Picard gets it wrong one last time. To represent the world as systems of interdependent logics we need not elevate those logics to the level of myth, nor focus on the logics of our myths. Instead, we would have to meditate on the logics in everything, to see the world as one built of weird, rusty machines whose gears squeal as they grind against one another, rather than as stories into which we might write ourselves as possible characters.
- In a much shorter piece, Bogost writes about “Yo”, a new social app that enables users to send just the word “yo” to others:
It’s stupid. There’s no other word for it. But according to TechCrunch, 50,000 people have sent 4 million Yos since the app was launched on, uhm, April Fool’s Day of this year. But sometimes in stupidity we find a kind of frankness, an honesty. For his part, Arbel has rather overstated the matter. “We like to call it context-based messaging,” he told The New York Times. “You understand by the context what is being said.”
Perhaps the problem with Yo isn’t what makes it stupid—its attempt to formalize the meta-communication common to online life—but what makes it gross: the need to contain all human activity within the logics of tech startups. The need to expect something from every idea, even the stupid ones, to feel that they deserve attention, users, data, and, inevitably, payout. Perhaps this is the greatest meta-communicative message of today’s technology scene. And it might not be inaccurate to summarize that message with a singular, guttural “yo.”
- Following last month’s post of David Graeber’s views on “bullshit jobs,” this Salon interview with Graeber discusses the failed forecast of universal leisure time:
Right after my original bullshit jobs piece came out, I used to think that if I wanted, I could start a whole career in job counseling – because so many people were writing to me saying “I realize my job is pointless, but how can I support a family doing something that’s actually worthwhile?” A lot of people who worked the information desk at Zuccotti Park, and other occupations, told me the same thing: young Wall Street types would come up to them and say “I mean, I know you’re right, we’re not doing the world any good doing what we’re doing. But I don’t know how to live on less than a six figure income. I’d have to learn everything over. Could you teach me?”
But I don’t think we can solve the problem by mass individual defection. Or some kind of spiritual awakening. That’s what a lot of people tried in the ‘60s and the result was a savage counter-offensive which made the situation even worse. I think we need to attack the core of the problem, which is that we have an economic system that, by its very nature, will always reward people who make other people’s lives worse and punish those who make them better. I’m thinking of a labor movement, but one very different than the kind we’ve already seen. A labor movement that manages to finally ditch all traces of the ideology that says that work is a value in itself, but rather redefines labor as caring for other people.
- In an article for Al Jazeera, Sarah Kendzior surveys the politics of gentrification and the perils of hipster economics:
Proponents of gentrification will vouch for its benevolence by noting it “cleaned up the neighbourhood”. This is often code for a literal white-washing. The problems that existed in the neighbourhood – poverty, lack of opportunity, struggling populations denied city services – did not go away. They were simply priced out to a new location.
That new location is often an impoverished suburb, which lacks the glamour to make it the object of future renewal efforts. There is no history to attract preservationists because there is nothing in poor suburbs viewed as worth preserving, including the futures of the people forced to live in them. This is blight without beauty, ruin without romance: payday loan stores, dollar stores, unassuming homes and unpaid bills. In the suburbs, poverty looks banal and is overlooked.
In cities, gentrifiers have the political clout – and accompanying racial privilege – to reallocate resources and repair infrastructure. The neighbourhood is “cleaned up” through the removal of its residents. Gentrifiers can then bask in “urban life” – the storied history, the selective nostalgia, the carefully sprinkled grit – while avoiding responsibility to those they displaced.
Hipsters want rubble with guarantee of renewal. They want to move into a memory they have already made.
- At Mute, Dominic Pettman writes about the rise of MOOCs (Massively Open Online Courses) in higher education, and how commodification affects the value of learning:
In the pedagogic trenches, MOOCs are considered a symptom of wider economic patterns which effectively vacuum resources up into the financial stratosphere, leaving those doing the actual work with many more responsibilities, and far less compensation. Basic questions about the sustainability of this model remain unanswered, but it is clear that there is little room for enfranchised, full-time, fully-compensated faculty. Instead, we find an army of adjuncts servicing thousands of students; a situation which brings to mind scenes from Metropolis rather than Dead Poets Society.
For companies pushing MOOCs, education is no different from entertainment: it is simply a question of delivering ‘content.’ But learning to think exclusively via modem is like learning to dance by watching YouTube videos. You may get a sense of it, but no-one is there to point out mistakes, deepen your understanding, contextualise the gestures, shake up your default perspective, and facilitate the process. The role of the professor or instructor is not simply the shepherd for the transmission of information from point A to point B, but the co–forging of new types of knowledge, and critically testing these for various versions of soundness and feasibility. Wisdom may be eternal, but knowledge – both practical and theoretical – evolves over time, and especially exponentially in the last century, with all its accelerated technologies. Knowledge is always mediated, so we must consciously take the tools of mediation into account. Hence the need for a sensitive and responsive guide: someone students can bounce new notions off, rather than simply absorb information from. Without this element, distance learning all too often becomes distanced learning. Just as a class taken remotely usually leads to a sea of remote students.
Marshall McLuhan was half-right when he insisted that the electronic age is ushering in a post-literate society. But no matter how we like to talk of new audio-visual forms of literacy, there is still the ‘typographic man’ pulling the strings, encouraging us to express ourselves alphabetically. Indeed, the electronic and the literate are not mutually exclusive, much as people like to pit them against each other.
- Pettman also quotes Ian Bogost’s comments on distance learning:
The more we buy into the efficiency argument, the more we cede ground to the technolibertarians who believe that a fusion of business and technology will solve all ills. But then again, I think that’s what the proponents of MOOCs want anyway. The issue isn’t online education per se, it’s the logics and rationales that come along with certain implementations of it.
It’s been a long time since the last update (what happened to October?), so this post is extra long in an attempt to catch up.
- I haven’t seen the new Ender’s Game movie, but this review by abbeyotis at Cyborgology calls the film “a lean and contemporary plunge into questions of morality mediated by technology”:
In a world in which interplanetary conflicts play out on screens, the government needs commanders who will never shrug off their campaigns as merely “virtual.” These same commanders must feel the stakes of their simulated battles to be as high as actual warfare (because, of course, they are). Card’s book makes the nostalgic claim that children are useful because they are innocent. Hood’s movie leaves nostalgia by the roadside, making the more complex assertion that they are useful because of their unique socialization to be intimately involved with, rather than detached from, simulations.
- In the ongoing discourse about games criticism and its relation to film reviews, Bob Chipman’s latest Big Picture post uses his own review of the Ender’s Game film as an entry point for a breathless treatise on criticism. The video presents a concise and nuanced overview of arts criticism, from the classical era through film reviews as consumer reports up to the very much in-flux conceptions of games criticism. Personally, I find this video sub-genre (where spoken content is crammed into a Tommy-gun barrage of word bullets so that the narrator can convey a lot of information in a short running time) irritating and mostly worthless, since the verbal information is presented faster than the listener can really process it. It reminds me of Film Crit Hulk, someone who writes excellent essays with obvious insight into filmmaking, but whose aesthetic choice (or “gimmick”) of writing in all caps is often a distraction from the content and a deterrent to readers. Film Crit Hulk has of course addressed this issue and explained the rationale for the choice, but considering that his more recent articles have dropped the third-person “Hulk speak” writing style, the all caps seems played out. Nevertheless, I’m sharing the video because Mr. Chipman makes a lot of interesting points, particularly regarding the cultural contexts for the various forms of criticism. Just remember to breathe deeply and monitor your heart rate while watching.
- In this video of a presentation titled Game design: the medium is the message, Jonathan Blow discusses how commercial constraints dictate the form of products from TV shows to video games.
- This somewhat related video from mynextappliance contextualizes Valve’s Steam Machine’s place in gaming history.
- This video from Satchbag’s Goods is ostensibly a review of Hotline Miami, but develops into a discussion of art movements and Kanye West:
- This short interview with Slavoj Žižek in New York magazine continues a trend I’ve noticed since the release of The Pervert’s Guide to Ideology, wherein writers interviewing Žižek feel compelled to include themselves and their reactions to and interactions with Žižek in their articles. Something about a Žižek encounter brings out the gonzo in journalists. The NY mag piece is also notable for this succinct positioning of Žižek’s contribution to critical theory:
Žižek, after all, the Yugoslav-born, Ljubljana-based academic and Hegelian; mascot of the Occupy movement, critic of the Occupy movement; and former Slovenian presidential candidate, whose most infamous contribution to intellectual history remains his redefinition of ideology from a Marxist false consciousness to a Freudian-Lacanian projection of the unconscious. Translation: To Žižek, all politics—from communist to social-democratic—are formed not by deliberate principles of freedom, or equality, but by expressions of repressed desires—shame, guilt, sexual insecurity. We’re convinced we’re drawing conclusions from an interpretable world when we’re actually just suffering involuntary psychic fantasies.
- Wired UK reported on university students who turned maps of seventeenth-century London into a detailed 3D world:
Following the development of the environment on the team’s blog you can see some of the gaps between what data was deemed noteworthy or worth recording in the seventeenth century and the level of detail we now expect in maps and other infographics. For example, the team struggled to pinpoint the exact location on Pudding Lane of the bakery where the Great Fire of London is thought to have originated and so just ended up placing it halfway along.
- Stephen Totilo reviewed the new pirate-themed Assassin’s Creed game for the New York Times. I haven’t played the game, but I love that the sections of the game set in the present day have shifted from the standard global conspiracy tropes seen in the earlier installments to postmodern self-referential and meta-fictional framing:
Curiously, a new character is emerging in the series: Ubisoft itself, presented mostly in the form of self-parody in the guise of a fictional video game company, Abstergo Entertainment. We can play small sections as a developer in Abstergo’s Montreal headquarters. Our job is to help turn Kenway’s life — mined through DNA-sniffing gadgetry — into a mass-market video game adventure. We can also read management’s emails. The team debates whether games of this type could sell well if they focused more on peaceful, uplifting moments of humanity. Conflict is needed, someone argues. Violence sells.
It turns out that Abstergo is also a front for the villainous Templars, who search for history’s secrets when not creating entertainment to numb the population. In these sections, Ubisoft almost too cheekily aligns itself with the bad guys and justifies its inevitable 2015 Assassin’s Creed, set during yet another violent moment in world history.
- Speaking of postmodern, self-referential, meta-fictional video games: The Stanley Parable was released late last month. There has already been a bevy of analysis written about the game, but I am waiting for the Mac release to play it and doing my best to avoid spoilers in the meantime. Brenna Hillier’s post at VG247 is spoiler-free (assuming you are at least familiar with the game’s premise, or its original incarnation as a Half-Life 2 mod), and calls The Stanley Parable “a reaction against, commentary upon, critique and celebration of narrative-driven game design”:
The Stanley Parable wants you to think about it. The Stanley Parable, despite its very limited inputs (you can’t even jump, and very few objects are interactive) looks at those parts of first-person gaming that are least easy to design for – exploration and messing with the game’s engine – and foregrounds them. It takes the very limitations of traditional gaming narratives and uses them to ruthlessly expose their own flaws.
- An article at TechCrunch looks at how the Twitter-acquired Bluefin Labs “took the academic subject of semiotics and made it something ‘central’ to the future of Twitter’s business”:
Roy’s research focus prior to founding Bluefin, and continued interest while running the company, has to do with how both artificial and human intelligences learn language. In studying this process, he determined that the most important factor in meaning making was the interaction between human beings: no one learns language in a vacuum, after all. That lesson helped inform his work at Twitter, which started with mapping the connection between social network activity and live broadcast television.
- Nathan at metopal posted their paper posing the question: What happens when we stop thinking about videogames as cinema and instead think of them through other media, like fashion, dance, or architecture?
Aspiring to cinematic qualities is not bad in and of itself, nor do I mean to shame fellow game writers, but developers and their attendant press tend to be myopic in their point of view, both figuratively and literally. If we continually view videogames through a monocular lens, we miss much of their potential. And moreover, we begin to use ‘cinematic’ reflexively without taking the time to explain what the hell that word means.
Metaphor is a powerful tool. Thinking videogames through other media can reframe our expectations of what games can do, challenge our design habits, and reconfigure our critical vocabularies. To crib a quote from Andy Warhol, we get ‘a new idea, a new look, a new sex, a new pair of underwear.’ And as I hinted before, it turns out that fashion and videogames have some uncanny similarities.
- John Powers at the Airship posted this great longform piece on the political economy of zombies:
Zombies started their life in the Hollywood of the 1930s and ‘40s as simplistic stand-ins for racist xenophobia. Post-millennial zombies have been hot-rodded by Danny Boyle and made into a subversive form of utopia. That grim utopianism was globalized by Max Brooks, and now Brad Pitt and his partners are working to transform it into a global franchise. But if zombies are to stay relevant, it will rely on the shambling monsters’ ability to stay subversive – and real subversive shocks and terror are not dystopian. They are utopian.
- This article at The Conversation addresses the “touchy subject” of Apple’s Touch ID:
Ironically, our bodies now must make physical contact with devices dictating access to the real; Apple’s Touch ID sensor can discern for the most part if we are actually alive. This way, we don’t end up trying to find our stolen fingers on the black market, or prevent others from 3D scanning them to gain access to our lives.
This is a monumental shift from when Apple released its first iPhone just six years ago. It’s a touchy subject: fingerprinting authentication means we confer our trust in an inanimate object to manage our animate selves – our biology is verified, digitised, encrypted, as they are handed over to our devices.
- In the wake of the Silk Road shutdown last month, Chloe Albanesius at PC Mag asks: What was Silk Road and how did it work?
Can you really buy heroin on the Web as easily as you might purchase the latest best-seller from Amazon? Not exactly, but as the FBI explained in its complaint, it wasn’t exactly rocket science, thanks to Tor and some bitcoins. Here’s a rundown of how Silk Road worked before the feds swooped in.
- Henry Jenkins posted the transcript of an interview with Mark J.P. Wolf. The theme of the discussion is “imaginary worlds,” and they touch upon the narratology vs. ludology conflict in gaming:
The interactivity vs. storytelling debate is really a question of the author saying either “You choose” (interaction) or “I choose” (storytelling) regarding the events experienced; it can be all of one or all of the other, or some of each to varying degrees; and even when the author says “You choose”, you are still choosing from a set of options chosen by the author. So it’s not just a question of how many choices you make, but how many options there are per choice. Immersion, however, is a different issue, I think, which does not always rely on choice (such as immersive novels), unless you want to count “Continue reading” and “Stop reading” as two options you are constantly asked to choose between.
- Finally, GamesForChange has uploaded video of Ian Bogost’s keynote address from this year’s Games for Change Festival. Bogost extolls the virtues of “earnestness” over “seriousness” in game design: