- Immediately following Elliot Rodger’s spree killing in Isla Vista, CA, last month, Internet users discovered his YouTube channel and a 140-page autobiographical screed, dubbed a “manifesto” by the media. The written document and the videos documented Rodger’s sexual frustration and his chronic inability to connect with other people. He specifically lashed out at women for forcing him “to endure an existence of loneliness, rejection and unfulfilled desires” and causing his violent “retribution.” Commentators and the popular press framed the killings as an outcome of misogynistic ideology, with headlines such as “How misogyny kills men,” “Further proof that misogyny kills,” and “Elliot Rodger proves the danger of everyday sexism.” Slate contributor Amanda Hess wrote:
Elliot Rodger targeted women out of entitlement, their male partners out of jealousy, and unrelated male bystanders out of expedience. This is not ammunition for an argument that he was a misandrist at heart—it’s evidence of the horrific extent of misogyny’s cultural reach.
- Writing at Cyborgology, Jenny Davis saw the tragedy as a terrible lesson in misogyny and digital dualism, the Cyborgology blog’s pet theory:
His parents saw the digitally mediated rants and contacted his therapist and a social worker, who contacted a mental health hotline. These were the proper steps. But those who interviewed Rodger found him to be a “perfectly polite, kind and wonderful human.” They deemed his involuntary holding unnecessary and a search of his apartment unwarranted. That is, authorities defined Rodger and assessed his intentions based upon face-to-face interaction, privileging this interaction over and above a “vast digital trail.” This is digital dualism taken to its worst imaginable conclusion.
- Eryk Salvaggio at Like Fish posted a thorough analysis of Rodger’s manifesto, looking at how women function as objects and symbols in the text:
In fact, the entire 140-odd-page memoir he left behind, “My Twisted World,” documents with agonizing repetition the daily tortured minutiae of his life, and barely has any interactions with women. What it has is interactions with the symbols of women, a non-stop shuffling of imaginary worlds that women represented access to. Women weren’t objects of desire per se; they were currency.
What exists in painstaking detail are the male figures in his life. The ones he meets who then reveal that they have kissed a girl, or slept with a girl, or slept with a few girls. These are the men who have what Elliot can’t have, and these are the men that he obsesses over.
Women don’t merely serve as objects for Elliot. Women are the currency used to buy whatever he’s missing. Just as a dollar bill used to get you a dollar’s worth of silver, a woman is an indicator of spending power. He wants to throw this money around for other people. Bring them home to prove something to his roommates. Show the bullies who picked on him that he deserves the same things they do.
There’s another, slightly more obscure recurring theme in Elliot’s manifesto: The frequency with which he discusses either his desire or attempt to throw a glass of some liquid at happy couples, particularly if the girl is a ‘beautiful tall blonde.’ […] These are the only interactions Elliot has with women: marking his territory.
When we don’t know how else to say what we need, like entitled children, we scream, and the loudest scream we have is violence. Violence is not an act of expressing the inexpressible; it’s an act of expressing our frustration with the inexpressible. When we surround ourselves with closed ideology, anger and frustration and rage come to us when words can’t. Some ideologies prey on fear and hatred and shift them into symbols that all other symbols are defined by. This limits your vocabulary.
- Some of these analyses recall Douglas Kellner’s take on school shootings as crises of masculinity:
While the motivations for the shootings may vary, they have in common crises in masculinity in which young men use guns and violence to create ultra-masculine identities as part of a media spectacle that produces fame and celebrity for the shooters.
Crises in masculinity are grounded in the deterioration of socio-economic possibilities for young men and are inflamed by economic troubles. Gun carnage is also encouraged in part by media that repeatedly illustrates violence as a way of responding to problems. Explosions of male rage and rampage are also embedded in the escalation of war and militarism in the United States from the long nightmare of Vietnam through the military interventions in Afghanistan and Iraq.
- Influenced by Debord, Kellner used the term “spectacle” in discussing the role of media coverage in events like rampage shootings:
For Debord, “spectacle” constituted the overarching concept to describe the media and consumer society, including the packaging, promotion, and display of commodities and the production and effects of all media. Using the term “media spectacle,” I am largely focusing on various forms of technologically-constructed media productions that are produced and disseminated through the so-called mass media, ranging from radio and television to the Internet and the latest wireless gadgets.
- Kellner’s comments from a 2008 interview talking about the Virginia Tech shooter’s videos broadcast after the massacre, and his comments on critical media literacy, remain relevant to the current situation:
Cho’s multimedia video dossier, released after the Virginia Tech shootings, showed that he was consciously creating a spectacle of terror to create a hypermasculine identity for himself and avenge himself to solve his personal crises and problems. The NIU shooter, dressed in black, emerged from a curtain onto a stage and started shooting, obviously creating a spectacle of terror, although as of this moment we still do not know much about his motivations. As for the television networks, since they are profit centers in a highly competitive business, they will continue to circulate school shootings and other acts of domestic terrorism as “breaking events” and will constitute the murderers as celebrities. Some media have begun to not publicize the names of teen suicides to attempt to deter copy-cat effects, and the media should definitely be concerned about creating celebrities out of school shooters and not sensationalizing them.
People have to become critical of the media scripts of hyperviolence and hypermasculinity that are projected as role models for men in the media, or that help to legitimate violence as a means to resolve personal crises or solve problems. We need critical media literacy to analyze how the media construct models of masculinities and femininities, good and evil, and become critical readers of the media who ourselves seek alternative models of identity and behavior.
- Almost immediately after news of the violence broke, and word of the killer’s YouTube videos spread, there was a spike of online backlash against the media saturation and warnings against promoting the perpetrator to celebrity status through omnipresent news coverage. Just two days after the killings Isla Vista residents and UCSB students let the news crews at the scene know that they were not welcome to intrude upon the community’s mourning. As they are wont to do, journalists reported on their role in the story while ignoring the wishes of the residents, as in this LA Times brief:
More than a dozen reporters were camped out on Pardall Road in front of the deli — and had been for days, their cameras and lights and gear taking up an entire lane of the street. At one point, police officers showed up to ensure that tensions did not boil over.
The students stared straight-faced at reporters. Some held signs expressing their frustration with the news media:
“OUR TRAGEDY IS NOT YOUR COMMODITY.”
“Remembrance NOT ratings.”
“Stop filming our tears.”
“Let us heal.”
“NEWS CREWS GO HOME!”
- Commemorating the 25th anniversary of the publication of his infamous essay, “The End of History?”, Francis Fukuyama wrote an essay for the Wall Street Journal reflecting on how the world has changed since he declared the end of history:
I argued that History (in the grand philosophical sense) was turning out very differently from what thinkers on the left had imagined. The process of economic and political modernization was leading not to communism, as the Marxists had asserted and the Soviet Union had avowed, but to some form of liberal democracy and a market economy. History, I wrote, appeared to culminate in liberty: elected governments, individual rights, an economic system in which capital and labor circulated with relatively modest state oversight.
So has my end-of-history hypothesis been proven wrong, or if not wrong, in need of serious revision? I believe that the underlying idea remains essentially correct, but I also now understand many things about the nature of political development that I saw less clearly during the heady days of 1989.
Twenty-five years later, the most serious threat to the end-of-history hypothesis isn’t that there is a higher, better model out there that will someday supersede liberal democracy; neither Islamist theocracy nor Chinese capitalism cuts it. Once societies get on the up escalator of industrialization, their social structure begins to change in ways that increase demands for political participation. If political elites accommodate these demands, we arrive at some version of democracy.
- An article by Eliane Glaser in The Guardian considers whether Fukuyama’s hypothesis is a rightwing argument in disguise:
When he wrote “The End of History?”, Fukuyama was a neocon. He was taught by Leo Strauss’s protege Allan Bloom, author of The Closing of the American Mind; he was a researcher for the Rand Corporation, the thinktank for the American military-industrial complex; and he followed his mentor Paul Wolfowitz into the Reagan administration. He showed his true political colours when he wrote that “the class issue has actually been successfully resolved in the west … the egalitarianism of modern America represents the essential achievement of the classless society envisioned by Marx.” This was a highly tendentious claim even in 1989.
Fukuyama distinguished his own position from that of the sociologist Daniel Bell, who published a collection of essays in 1960 titled The End of Ideology. Bell had found himself, at the end of the 1950s, at a “disconcerting caesura”. Political society had rejected “the old apocalyptic and chiliastic visions”, he wrote, and “in the west, among the intellectuals, the old passions are spent.” Bell also had ties to neocons but denied an affiliation to any ideology. Fukuyama claimed not that ideology per se was finished, but that the best possible ideology had evolved. Yet the “end of history” and the “end of ideology” arguments have the same effect: they conceal and naturalise the dominance of the right, and erase the rationale for debate.
While I recognise the ideological subterfuge (the markets as “natural”), there is a broader aspect to Fukuyama’s essay that I admire, and cannot analyse away. It ends with a surprisingly poignant passage: “The end of history will be a very sad time. The struggle for recognition, the willingness to risk one’s life for a purely abstract goal, the worldwide ideological struggle that called forth daring, courage, imagination, and idealism, will be replaced by economic calculation, the endless solving of technical problems, environmental concerns, and the satisfaction of sophisticated consumer demands.”
- Late last year the International Forum for Democratic Studies interviewed Fukuyama about his article “Democracy and the Quality of the State.”
- Finally, the Cato Institute just held a conference where Fukuyama and several other scholars discussed “The End of History 25 Years Later”. Videos and podcasts of the panels are available at the conference site. Description of the conference and list of participants:
In an article that went viral in 1989, Francis Fukuyama advanced the notion that with the death of communism history had come to an end in the sense that liberalism — democracy and market capitalism — had triumphed as an ideology. Fukuyama will be joined by other scholars to examine this proposition in the light of experience during the subsequent quarter century.
Featuring Francis Fukuyama, author of “The End of History?”; Michael Mandelbaum, School of Advanced International Studies, Johns Hopkins University; Marian Tupy, Cato Institute; Adam Garfinkle, editor, American Interest; Paul Pillar, Nonresident Senior Fellow, Foreign Policy, Center for 21st Century Security and Intelligence, Brookings Institution; and John Mueller, Ohio State University and Cato Institute.
- The 2014 World Cup kicked off yesterday with a futuristic twist on the opening ceremonies. A paraplegic kicked a soccer ball using an exoskeleton designed by the Walk Again Project:
The exoskeleton — a system comprising a helmet implanted with a microchip that sticks out from the underside; a T-shirt loaded with sensors; metal leg braces; and a battery worn in a backpack — is set in motion when the user envisions himself making the kick. The chip translates those electronic commands to a digital language that powers the skeleton, which then moves accordingly. The T-shirt vibrates to enhance the user’s sensation of movement (and eliminate the need to look at his feet to see if he’s stepping forward).
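The pipeline in that description (the user envisions a movement, a chip decodes it, the braces actuate, haptics feed back) can be caricatured in a few lines of Python. This is purely an illustrative sketch with invented names and thresholds, not the Walk Again Project’s actual system:

```python
def decode_intent(signal_strength):
    """Stand-in for the helmet chip's classifier: map a (simulated)
    brain-signal reading to a movement intent."""
    return "kick" if signal_strength > 0.8 else "idle"

def actuate(intent):
    """Translate a decoded intent into leg-brace and haptic commands."""
    if intent == "kick":
        return {"legs": "swing_forward", "haptics": "vibrate"}
    return {"legs": "hold", "haptics": "off"}

def control_loop(signal_samples):
    """Run the envision -> decode -> actuate cycle over a signal stream."""
    return [actuate(decode_intent(s)) for s in signal_samples]

# Only the second sample crosses the (made-up) decision threshold.
commands = control_loop([0.2, 0.9, 0.5])
```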
- Unfortunately, as io9 reports, the moment was not well-covered by TV networks:
Talk about dropping the ball. Earlier today, Juliano Pinto — a 29-year-old paraplegic — successfully kicked off the 2014 FIFA World Cup by using a mind-controlled exoskeleton. But sadly, most TV networks failed to show it.
After months of hype, the official broadcast of the opening ceremonies showed only a fraction of it, while some TV networks missed the event altogether. Commentators criticized the organizers for casting aside the moment in favor of the performing acts.
- Thomas Frey at the Futurist Speaker blog forecasts the coming AI crash wars:
The invasion of high-frequency trading machines is now forcing capitalism far away from anything either Adam Smith or the founders of the NYSE could possibly find virtuous.
We’re not about to let robots compete in the Olympics, driverless cars race in the Indianapolis 500, or automated machines play sports like football, basketball, or baseball. So why is it we allow them to play a role in the most valuable contest of all, the worldwide stock exchange?
With crude forms of AI now entering the quant manipulator’s toolbox, we are teetering dangerously close to a total collapse of the stock market, one that will leave many corporations and individuals financially destitute.
- Microsoft has announced its version of Apple’s Siri virtual assistant. Named Cortana, after the AI character from the Halo video game series, she is coming to Windows smartphones, and as Brad Molen at Engadget reports, developers programmed her with a distinct personality:
Confident, caring, competent, loyal; helpful, but not bossy: These are just some of the words Susan Hendrich, the project manager in charge of overseeing Cortana’s personality, used to describe the program’s most significant character traits. “She’s eager to learn and can be downright funny, peppering her answers with banter or a comeback,” Hendrich said. “She seeks familiarity, but her job is to be a personal assistant.” With that kind of list, it sure sounds like Hendrich’s describing a human. Which is precisely what she and her team set out to do during Cortana’s development: create an AI with human-like qualities.
Microsoft’s decision to infuse Cortana with a personality stemmed from one end goal: user attachment. “We did some research and found that people are more likely to interact with [AI] when it feels more human,” said Hendrich. To illustrate that desired human-machine dynamic, Hendrich pointed to her grandmother’s experience with a Roomba vacuum: “She gave a name and a personality to an inanimate object, and it brought her joy.” That sense of familiarity is exactly what Microsoft wants Windows Phone users to feel when interacting with Cortana on their own devices.
- Ernesto Laclau, post-Marxist critical theorist and significant figure in discourse analysis (along with his wife and collaborator Chantal Mouffe), died on April 13.
- An obituary by British historian and academic Robin Blackburn was posted on the Verso web site:
Ernesto and Chantal used the work of Antonio Gramsci to reject what they saw as the reductionism and teleology of much Marxist theory. Though sometimes calling himself a ‘post-Marxist’ and an advocate of ‘radical democracy’, Ernesto insisted that he remained a radical anti-imperialist and anti-capitalist. His criticisms of Marx and Marxism were made in a constructive spirit, and without a hint of rancour.
Ernesto was recognised as a leading thinker in Latin America but also as an intellectual star in the academic world, co-authoring Contingency, Hegemony and Universality with Slavoj Žižek and Judith Butler in 2000. He gave courses at a string of leading universities in Europe and the Americas, including Northwestern and the New School for Social Research. Ernesto became emeritus professor at Essex in 2003, but the Centre he established continues its work.
- Blackburn also penned an article on Laclau that was published by The Guardian:
With collaborators including his wife, Chantal Mouffe, and the cultural theorist Stuart Hall, Laclau played a key role in reformulating Marxist theory in the light of the collapse of communism and failure of social democracy. His “post-Marxist” manifesto Hegemony and Socialist Strategy (1985), written with Mouffe, was translated into 30 languages, and sales ran into six figures. The book argued that the class conflict identified by Marx was being superseded by new forms of identity and social awareness. This worried some on the left, including Laclau’s friend Ralph Miliband, who feared that he had lost touch with the mundane reality of class division and conflict, but his criticisms of Marx and Marxism were always made in a constructive spirit.
Political populism was an enduring fascination for Laclau. His first book, Politics and Ideology in Marxist Theory (1977), offered a polite but devastating critique of the conventional discourse on Latin America at the time. This “dependency” approach tended to see the large landowners – latifundistas – as semi-feudal and pre-capitalist, while Laclau showed them to be part and parcel of Latin American capitalism which fostered enormous wealth and desperate poverty.
- Matthew Reisz wrote a remembrance for Times Higher Education:
Witnessing the impact of the Perónist movement in Argentina led Professor Laclau to a fascination with populism. He wrote a celebrated essay on the subject in the 1970s and then a full-length book, On Populist Reason (2005), looking at the rise of leftist politicians such as Hugo Chávez across much of Latin America. Both the current president of Argentina, Cristina Fernández de Kirchner, and her late husband and predecessor Néstor Kirchner, are said to have been great admirers of his work.
- Ryan Brading wrote a more personal reflection on Laclau’s influence on his own research trajectory:
Laclau’s theory of populism has played a critical role in my research. Without his theoretical insights and captivating character, I could not have expanded my initial observations of populist practices to this level. Besides his theoretical legacy and rich intellectual input outside academia, Prof. Laclau also contributed to the training and development of students and researchers from different parts of the world, thanks to the IDA programme he founded. His death is a great loss.
- Laclau’s last book, The Rhetorical Foundations of Society, was published last week by Verso.
- I’ve long been fascinated by the gaming culture in South Korea, and Tom Massey has written a great feature piece for Eurogamer titled Seoul Caliber: Inside Korea’s Gaming Culture. From this westerner’s perspective, having never visited Korea, the article reads almost more like cyberpunk fiction than games journalism:
Not quite as ubiquitous, but still extremely common, are PC Bangs: LAN gaming hangouts where 1000 Won nets you an hour of multiplayer catharsis. In Gangnam’s Maxzone, overhead fans rotate at Apocalypse Now speed, slicing cigarette smoke as it snakes through the blades. Korea’s own NCSoft, whose European base is but a stone’s throw from the Eurogamer offices, is currently going strong with its latest MMO, Blade & Soul.
“It’s relaxing,” says Min-Su, sipping a Milkis purchased from the wall-mounted vending machine. “And dangerous,” he adds. “It’s easy to lose track of time playing these games, especially when you have so much invested in them. I’m always thinking about achieving the next level or taking on a quick quest to try to obtain a weapon, and the next thing I know I’ve been here for half the day.”
- As a cyberpunk/hyperreality aside, the city of Hong Kong has put up a blue-sky backdrop for when the real sky is too smoggy for tourist photos.
- In yet another cyberpunk dystopian tangent, I recently came across Chris Rogers’ site Fragments of a Hologram Rose: Re-seeing Blade Runner, with an assortment of content and analysis relating to the film.
- And one final cyberpunk diversion: this video is the first part of a lecture by University of Michigan professor Eric Rabkin covering cyberpunk, postmodernism, and beyond.
- Writing for The New Economy, Aaran Franda examines how the virtual economies seen in games like EVE Online provide valuable perspectives on real world economic activity:
Creation and simulation in virtual worlds appear to offer the best domain to test the new ideas required to tackle the very real problems of deprivation, inequality, unemployment, and poverty that exist in national economies. On that note, the need to see our socioeconomic institutions for the games that they really are seems even more poignant.
In the words of Vili Lehdonvirta, a leading scholar in virtual goods and currencies, the suffering we see today is “not some consequence of natural or physical law”; it instead “is a result of the way we play these games.”
- Jon Evans at TechCrunch looks at jobs, robots, capitalism, inequality, and you:
The global economy seems to be bifurcating into a rich/tech track and a poor/non-tech track, not least because new technology will increasingly destroy/replace old non-tech jobs. (Yes, global. Foxconn is already replacing Chinese employees with one million robots.) So far so fairly non-controversial.
The big thorny question is this: is technology destroying jobs faster than it creates them?
We live in an era of rapid exponential growth in technological capabilities. (Which may finally be slowing down, true, but that’s an issue for decades hence.) If you’re talking about the economic effects of technology in the 1980s, much less the 1930s or the nineteenth century, as if it has any relevance whatsoever to today’s situation, then you do not understand exponential growth. The present changes so much faster that the past is no guide at all; the difference is qualitative, not just quantitative. It’s like comparing a leisurely walk to relativistic speeds.
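The rhetorical point in the excerpt above — that exponential change makes past eras a poor guide — is easy to make concrete with a toy comparison in Python. The numbers are arbitrary and only illustrate the shape of the gap:

```python
def linear_growth(start, step, periods):
    """Capability that improves by a fixed increment each period."""
    return [start + step * t for t in range(periods + 1)]

def exponential_growth(start, factor, periods):
    """Capability that multiplies each period (e.g. doubling)."""
    return [start * factor ** t for t in range(periods + 1)]

walking = linear_growth(1, 1, 10)        # 1, 2, 3, ..., 11
doubling = exponential_growth(1, 2, 10)  # 1, 2, 4, ..., 1024

# After ten periods the doubling track is roughly two orders of
# magnitude ahead -- the difference is qualitative, not just quantitative.
gap = doubling[-1] / walking[-1]
```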
- This recent episode of Radiolab focused on talking to machines:
We begin with a love story, from a man who unwittingly fell in love with a chatbot on an online dating site. Then, we encounter a robot therapist whose inventor became so unnerved by its success that he pulled the plug. And we talk to the man who coded Cleverbot, a software program that learns from every new line of conversation it receives…and that’s chatting with more than 3 million humans each month. Then, five intrepid kids help us test a hypothesis about a toy designed to push our buttons, and play on our human empathy. And we meet a robot built to be so sentient that its creators hope it will one day have a consciousness, and a life, all its own.
- This video shows a demo of using Google Glass for interactive augmented reality.
- A recent Guardian article by Juliette Garside warns that our digital infrastructure is exceeding the limits of human control:
“These outages are absolutely going to continue,” said Neil MacDonald, a fellow at technology research firm Gartner. “There has been an explosion in data across all types of enterprises. The complexity of the systems created to support big data is beyond the understanding of a single person and they also fail in ways that are beyond the comprehension of a single person.”
From high volume securities trading to the explosion in social media and the online consumption of entertainment, the amount of data being carried globally over the private networks, such as stock exchanges, and the public internet is placing unprecedented strain on websites and on the networks that connect them.
- In an “anti-videogame manifesto,” Keith Burgun argues for intrinsic rewards and against grinding in videogames:
What I want is systems that have intrinsic rewards; that are disciplines similar to drawing or playing a musical instrument. I want systems which are their own reward.
What videogames almost always give me instead is labor that I must perform for an extrinsic reward. I want to convince you that not only is this not what I want, it isn’t really what anyone wants.
- This video from PBS Digital Studios’ Off Book looks at the rise of competitive gaming & e-sports.
- Will Luton at GamesIndustry International writes about the celebrification of game developers:
This ‘celebrification’ is enlivening making games and giving players role models, drawing more people into development, especially indie and auteured games. This shift is proving more prosperous than any Skillset-accredited course or government pot could ever hope for. We are making men sitting in pants at their laptops for 12 hours a day as glamorous as it could be.
Creating luminaries will lead to all the benefits that more people in games can bring: a bigger and brighter community, plus new and fresh talent making exciting games. However, celebritydom demands storms, turmoil and gossip.
- The ongoing survey of Hollywood’s Summer of Doom continues with Isaac Chotiner’s New Republic article, Hollywood is in trouble and we’re all going to pay:
Spielberg’s theory is essentially that a studio will eventually go under after it releases five or six bombs in a row. The reason: budgets have become so gigantic. And, indeed, this summer has been full of movies with giant budgets and modest grosses, all of which has elicited hand-wringing about financial losses, the lack of a quality product (another post-apocalyptic thriller? more superheroes?), and a possible connection between the two. There has been some hope that Hollywood’s troubles will lead to a rethinking of how movies get made, and which movies get greenlit by studio executives. But a close look at this summer’s grosses suggests a more worrisome possibility: that the studios will become more conservative and even less creative.
- Finally, video of Slavoj Žižek and Paul A. Taylor discussing the difficulty of conveying philosophical ideas in today’s media.
- I’ve never played EVE Online, and I don’t even really understand how it works, but I find it fascinating. Last week saw the biggest battle in the game’s history. This breakdown from The Verge is headlined like a real-life dispatch from the frontier of mankind’s space-faring endeavors: Largest space battle in history claims 2,900 ships, untold virtual lives
Update, 9:18PM ET: The battle is over. After more than five hours of combat, the CFC has defeated TEST Alliance. Over 2,900 ships were destroyed today in the largest fleet battle in Eve Online’s history. TEST Alliance intended to make a definitive statement in 6VDT, but their defeat at the hands of the CFC was decisive and will likely result in TEST’s withdrawal from the Fountain region.
- Also last week, Microsoft confirmed that the retail version of the Xbox One will function as developer kits and the Xbox Live Arcade will allow independent developers to self-publish games. From the Game Informer article:
In a conversation with Whitten, he told us that the commitment to independent developers is full. There won’t be restrictions on the type of titles that can be created, nor will there be limits in scope. In response to a question on whether retail-scale games could be published independently, Whitten told us, “Our goal is to give them access to the power of Xbox One, the power of Xbox Live, the cloud, Kinect, Smartglass. That’s what we think will actually generate a bunch of creativity on the system.” With regard to revenue splitting with developers, we were told that more information will be coming at Gamescom, but that we could think about it “generally like we think about Marketplace today.” According to developers we’ve spoken with, that split can be approximately 50-50.
- Kris Ligman at Gamasutra reports that self-published games will also be available on the Xbox 360. This HuffPo post aggregates links and provides an overview of the Xbox One self-publishing story.
Another difference between the Xbox One and Xbox 360 is how the games will be published and bought by other gamers. Indie games will not be relegated to the Xbox Live Indie Marketplace like on the Xbox 360 or required to have a Microsoft-certified publisher to distribute physically or digitally outside the Indie Marketplace. All games will be featured in one big area with access to all kinds of games.
- DevWithTheHair argues that freemium is hurting modern video game design:
If anything has hurt modern video game design over the past several years, it has been the rise of ‘freemium’. It is now rare to see a top app or game in the app stores with a business model other than ‘free-to-play with in-app purchases’. Freemium has been used as an excuse to make lazy, poorly designed games that are predicated on taking advantage of psychological triggers in their players, and it will have negative long-term consequences for the video game industry if left unchecked.
Many freemium games are designed around the idea of conditioning players to become addicted to playing the game. Many game designers want their games to be heavily played, but in this case the freemium games are designed to trigger a ‘reward’ state in the player’s brain in order to keep the player playing (and ultimately entice the user to make in-app purchases to continue playing). This type of conditioning is often referred to as a ‘Skinner box’, named after the psychologist who created laboratory boxes used to perform behavioral experiments on animals.
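As a rough illustration of the conditioning loop described above, here is a minimal variable-ratio reward schedule in Python — the payoff pattern behavioral psychologists found most effective at sustaining repeated behavior, and the one many freemium loops approximate. All names and numbers here are invented for the sketch:

```python
import random

def variable_ratio_payout(mean_ratio, rng):
    """Pay out roughly once every `mean_ratio` actions, unpredictably."""
    return rng.random() < 1.0 / mean_ratio

def play_session(actions, mean_ratio=5, seed=42):
    """Simulate a session where every tap *might* pay off."""
    rng = random.Random(seed)
    payouts = 0
    for _ in range(actions):
        if variable_ratio_payout(mean_ratio, rng):
            payouts += 1  # unpredictable reward keeps the player tapping
    return payouts

# Over a long session payouts average about 1 in 5, but any single
# action might be the lucky one -- the unpredictability is the hook.
rewards = play_session(1000)
```

Because the payoff is random rather than scheduled, the player can never conclude that the next action is worthless, which is what distinguishes this from a fixed reward every N actions.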
- Grey Matter Gaming posted a thorough piece as the first part of a series on The Publisher-Developer Money-Go-Round and the Boom of the Indie Industry:
It obviously isn’t beyond the realm of possibility that not only do financial considerations influence a game’s structure and content, but financial outcomes also affect a studio’s likelihood of survival in the industry, based upon the machinations of its publishing overlords. Activision killed Bizarre Creations, Eidos ruined Looking Glass Studios, EA crushed Westwood, Pandemic, Bullfrog, Origin Systems… well, the list could go on until I turn a strange, purple color, but you get my point. And when 3.4 million copies sold for a Tomb Raider reboot isn’t enough by a publisher’s standards, you can’t help but feel concern for a developer’s future.
- Doctoral student Stephen Slota writes about how video games can enhance learning and problem-solving:
This relationship between environment-learner-content interaction and transfer puts teachers in the unique position to capitalize on game engagement to promote reflection that positively shapes how students tackle real-world challenges. To some, this may seem like a shocking concept, but it’s definitely not a new one—roleplay as instruction, for example, was very popular among the ancient Greeks and, in many ways, served as the backbone for Plato’s renowned Allegory of the Cave. The same is true of Shakespeare’s works, 18th and 19th century opera, and many of the novels, movies, and other media that define our culture. More recently, NASA has applied game-like simulations to teach astronauts how to maneuver through space, medical schools have used them to teach robotic surgery, and the Federal Aviation Administration has employed them to test pilots.
- Alex Law at Nightmare Mode posted a great article titled Player-Character Dynamics, Identity, and Sexuality in Video Games:
The relationships between the creator, the product, and the audience are all important contexts to consider during media analysis, especially with games. This is because the audience is an active participant in the media. So if you are creating a game you always have to keep in mind the audience. Even if you say the audience doesn’t matter to you, it won’t cease to exist, and it does not erase the impact your game will have.
Similarly, if you are critiquing or analyzing any media, you can’t ignore the creator and the creator’s intentions. Despite those who claim the “death of the author,” if the audience is aware of the creator’s intentions, it can affect how they perceive the game. Particularly, if you consider the ease in which creators can release statements talking about their work, you’ll have an audience with varying levels of awareness about the creator’s intentions. These factors all play off of each other–they do not exist in a vacuum.
- DROP OUT. HANG OUT. SPACE OUT. chimes in on Warren Spector’s call for better games criticism:
When we talk about any medium’s legitimacy, be it film or videogames or painting, it’s a very historical phenomenon that is inextricably tied to its artness that allows for them to get in on the ground floor of “legitimate” and “important.” So if we contextualize the qualities that allowed for film or photography to find themselves supported through a panoply of cultural institutions, it was a cultural and political economic process that led them there.
Videogames, the kind that would be written about in 20 dollar glossy art magazines, would be exactly this. When creators of videogames want to point to their medium’s legitimacy, it would help to have a lot of smart people legitimate your work in a medium (glossy magazines, international newspapers) that you consider to be likewise legitimate. Spector concedes that ‘yes all the critics right now are online’, but the real battle is in getting these critics offline and into more “legitimate” spaces of representation. It’s a kind of unspoken hierarchy of mediums that is dancing before us here: at each step a new gatekeeper steps into play, both legitimating and separating the reader from the critic and the object of criticism.
- Pop Matters looks at how fatherhood is represented in Heavy Rain, The Walking Dead, and The Last of Us:
All three games define fatherhood around the act of protection, primarily physical protection. And in each of these games, the protagonist fails—at least temporarily—to protect their ward. In Ethan’s case, his cheery family reflected in his pristine home collapses when he loses a son in a car accident. Later, when his other son goes missing, the game essentially tests Ethan’s ability to reclaim his protective-father status.
- Sam Barsanti at The Gameological Society explains how the much-derided final act of BioShock actually drives home one of its most important themes:
No video game grants absolute freedom; they all have rules or guidelines that govern what you can and can’t do. The sci-fi epic Mass Effect is a series that prides itself on choice, but even that trilogy ends on a variation of choosing between the “good” and “bad” ending. Minecraft, the open-world creation game, is extremely open-ended, but you can’t build a gun or construct a tower into space because it doesn’t let you. BioShock’s ending argues that the choices you think you’re making in these games don’t actually represent freedom. You’re just operating within the parameters set by the people in control, be they the developers or the guy in the game telling you to bash his skull with a golf club.
BioShock’s disappointing conclusion ends up illustrating Ryan’s point. A man chooses, a player obeys. It’s a grim and cynical message that emphasizes the constraints of its own art form. And given that the idea of choice is so important to BioShock’s story, I don’t think it could’ve ended any other way.
- The latest “this week in videogame blogging” post at Critical Distance includes The Last of Us as an exercise in emotional manipulation, an analysis of Saints Row: The Third through the lens of thematic self-sabotage, and a “Let’s Critique” commentary for Dishonored.
- William Saletan at Slate shows how media coverage has misrepresented Juror B29’s comments on the Zimmerman trial verdict:
The reports are based on an ABC News interview with Juror B29, the sole nonwhite juror. She has identified herself only by her first name, Maddy. She’s been framed as the woman who was bullied out of voting to convict Zimmerman. But that’s not true. She stands by the verdict. She yielded to the evidence and the law, not to bullying. She thinks Zimmerman was morally culpable but not legally guilty. And she wants us to distinguish between this trial and larger questions of race and justice.
ABC News hasn’t posted a full unedited video or transcript of the interview. The video that has been broadcast—on World News Tonight, Nightline, and Good Morning America—has been cut and spliced in different ways, often so artfully that the transitions appear continuous. So beware what you’re seeing. But the video that’s available already shows, on closer inspection, that Maddy has been manipulated and misrepresented. Here are the key points.
- This follows Zimmerman filing suit against NBC for defamation:
In the recording heard by NBC viewers, Zimmerman appeared to volunteer the information, “This guy looks like he’s up to no good. He looks black.”
Edited out was the 911 dispatcher asking Zimmerman if the person he was suspicious of was “black, white or Hispanic,” to which Zimmerman had responded, “He looks black.”
- John Nolte at Breitbart thinks that CNN’s coverage of the Zimmerman case establishes the network as “the most disgraced name in news”:
Though Zimmerman and his attorneys have filed a lawsuit against NBC News for the malicious editing of the 911 tape, what CNN did is far worse.
NBC News was attempting to make Zimmerman look like a racial profiler. CNN, on the other hand, was attempting to make Zimmerman look like an enraged outright racist (there was no racial angle in ABC’s fraud). It also took CNN far longer to retract their story than either NBC or ABC.
Moreover, on its own airwaves, CNN would allow the complete fallacy that Zimmerman had said “fucking coon” to live on.
- Dan Laughey offers an idiosyncratically British perspective on “royal baby” media coverage:
Pulling teeth doesn’t do justice to the painful viewing experience accompanying this sort of news manufacture – making news from no news. Even the daily palaver known as Changing the Guard was spun to look like an integral prelude to the long-awaited arrival. And the waiting went on, and on, and on, and the longer it went on, the more desperate and dull the coverage became. Sometimes people complain about the high salaries enjoyed by news presenters, especially the public service variety, but by golly they earnt their crust trying, albeit failing, to sustain the suspense.
- In the New York Review of Books, Martin Scorsese discusses “reading the language of cinema”:
Light is at the beginning of cinema, of course. It’s fundamental—because cinema is created with light, and it’s still best seen projected in dark rooms, where it’s the only source of light. But light is also at the beginning of everything. Most creation myths start with darkness, and then the real beginning comes with light—which means the creation of forms. Which leads to distinguishing one thing from another, and ourselves from the rest of the world. Recognizing patterns, similarities, differences, naming things—interpreting the world. Metaphors—seeing one thing “in light of” something else. Becoming “enlightened.” Light is at the core of who we are and how we understand ourselves.
Or consider the famous Stargate sequence from Stanley Kubrick’s monumental 2001: A Space Odyssey. Narrative, abstraction, speed, movement, stillness, life, death—they’re all up there. Again we find ourselves back at that mystical urge—to explore, to create movement, to go faster and faster, and maybe find some kind of peace at the heart of it, a state of pure being.
- The Guardian provides an update on Hollywood’s summer of doom:
Despite stormy forecasts, Hollywood appears to be too unwieldy or too unwilling to shift direction towards smaller, cheaper pictures. Guests at Comic-Con learned about upcoming studio productions including Pirates of the Caribbean 5, Thor 2, Fantastic Four 3 and a reboot of Godzilla. The director Joss Whedon came to the event to lament that “pop culture is eating itself” and called for “new universes, new messages and new icons”. He then revealed the title of his next film to be Avengers: Age of Ultron.
- Also in the Guardian, John Naughton writes that Edward Snowden is not the story:
Repeat after me: Edward Snowden is not the story. The story is what he has revealed about the hidden wiring of our networked world. This insight seems to have escaped most of the world’s mainstream media, for reasons that escape me but would not have surprised Evelyn Waugh, whose contempt for journalists was one of his few endearing characteristics. The obvious explanations are: incorrigible ignorance; the imperative to personalise stories; or gullibility in swallowing US government spin, which brands Snowden as a spy rather than a whistleblower.
- Lauren Granger at memeburn reports on YouTube’s first ever Geek Week:
The video site is aiming to showcase some geek culture by pronouncing 4-10 August its first ever ‘Geek Week’ and promoting some of the genre’s top channels which cover everything from sci-fi to comics, gaming and superheroes. To do this, its own channel will be featuring videos from users like Nerdist, the official Doctor Who channel, MinutePhysics and more than a hundred others, with every day of the week hosted by a different user. It’ll even include the first trailer for the new Thor movie, The Dark World.
- Chris Cagle at Category D writes about the new documentary Blackfish and “the effaced spectator”:
That said, things kept nagging me. Blackfish does raise some valuable secondary issues – how SeaWorld markets itself, how labor issues are at stake in addition to environmental ones – but as a spectator I kept wanting the film to pursue lines of analysis that it would suggest but never develop.
In short, if there’s an ur-ideology to the American progressive documentary, it’s that demand-side drivers of political situations (Gramsci’s hegemony, ideology, what have you) don’t matter, it’s merely the supply side of oligopoly, big money, and corporate control. Or to be less political, as a film scholar I can’t help but notice that in a film about the business of spectacle, the spectator is both crucial (SeaWorld viewers provide the vital footage of the incidents) and completely effaced.
- Matthew Manarino at New Media Rockstars looks at 10 years of AdSense:
And what of the YouTube creator? How has AdSense helped or hindered their careers? In most cases, the advertising structure has been a blessing to creators as it’s allowed them to launch careers solely through YouTube. AdSense gave us a new type of celebrity for a new generation.
Creators have had their fair share of AdSense woes in the past, though. Last year, one of YouTube’s biggest names, Ray William Johnson, entered a very public dispute with Maker Studios. Johnson claimed that Maker Studios was holding his AdSense account “hostage” even after he had terminated his contract with them.
- Scott Nye, a Rogerebert.com contributor, has just discovered the lampshade hanging trope:
If you watch big budget entertainments, there’s no escaping these sorts of moments. The trope familiar to the Scooby-Doo generation, in which a few nagging uncertainties are resolved with a “there’s just one thing I don’t understand” kickoff, has now become a motif. Characters must constantly address questions on behalf of a too-curious audience awash in complexly-plotted mega-stories. The movies are trying to plug leaks in a boat before the whole thing sinks—never quite repairing it, but doing just enough to get by.
- Here is the TV Tropes page on Lampshade Hanging.
- Cyborgology contributor Britney Summit-Gil writes about remediation and violence against women in the Game of Thrones tv series:
What I’m talking about here is the unavoidable shift that occurs when content is remediated—that is, borrowed from one medium and reimagined in another. In this case, the content of the book series A Song of Ice and Fire (ASOIAF) is remediated to Game of Thrones, the HBO television series. Some of the differences in this instance of remediation seem pragmatic—remembrances are turned into scenes of their own, dialogue is shortened, characters omitted or altered for the sake of brevity and clarity. I am no purist, and I recognize that with remediation comes necessary alteration for the content to suit the new medium. But other differences speak volumes about our cultural biases and expectations surrounding those with socially-othered bodies—like Tyrion, Sam, and, of course, women. What can we say about these differences? And perhaps more importantly, what do they say about us?
- At BFI Nick Wrigley posted a look at some of Stanley Kubrick’s favorite films, with insight from Jan Harlan:
Why does it matter what Kubrick liked? For years I’ve enjoyed unearthing as much information as I can about his favourite films and it slowly became a personal hobby. Partly because each time I came across such a film (usually from a newly disclosed anecdote – thanks internet! – or Taschen’s incredible The Stanley Kubrick Archives book) I could use it as a prism to reveal more about his sensibilities. My appreciation of both him and the films he liked grew. These discoveries led me on a fascinating trail, as I peppered them throughout the 11 existing Kubrick features (not counting the two he disowned) I try to watch every couple of years. I’m sure a decent film festival could be themed around the Master List at the end of this article…