Tagged: criticaltheory

City space and emotion: Affect as urban infrastructure

For a change of pace this week, I thought I’d write about affect in relation to the urban condition. Specifically, I am going to focus on Nigel Thrift’s chapters on spatialities of feeling from his book Non-representational Theory: Space, Politics, Affect. Thrift begins the first chapter by characterizing cities as “maelstroms of affect” and asserting the “utter ubiquity of affect as a vital element of cities” (p. 171). Thrift questions why “the affective register” has not formed “a large part of the study of cities,” and states that “to read about affect in cities it is necessary to resort to the pages of novels, and the tracklines of poems” (p. 171).

I have to question what Thrift means by “the study of cities,” particularly in relation to the history of urban sociology. There is a lengthy history in this tradition of studying the affective register of cities, from Durkheim’s anomie and Simmel’s blasé attitude, through the emergence of modern criminology and social scientific studies of urban anxiety and the fear of crime.

There are, of course, myriad approaches available for studying cities. In addition to approaches from fiction and poetry, and the aforementioned social scientific methods, there abound philosophical, psychogeographic, and theological engagements with urban life. One approach to the study of cities that has been especially amenable to the affective register is the domain of urban design and planning. Practitioners and commentators from this realm (who often, erroneously and unfortunately, mistake their practice for urbanism entire) have long used affective language to describe and design urban spaces: happy streets, friendly spaces, menacing buildings, etc.

Thrift is not explicitly discussing “smart” urbanization projects, but of course much of the analysis across these two chapters is directly applicable to such initiatives. Strikingly, Ernst Bloch also says much of relevance to smart cities in his 1929 essay “The Anxiety of the Engineer.” Thrift’s summation of Bloch’s “apocalyptic” vision of cities from that essay reads like a ripped-from-the-headlines encapsulation of contemporary urbanization trends: “Transfixed by the idea of a totally safe and calculable environment, the capitalist city is fixed and unbending in the face of unexpected events: ‘it has rooted itself in midair’” (p. 198). It’s a fantastic connection to make, though I despair at my ever-growing reading list.

Lastly, I want to touch upon Thrift’s discussion of the misanthropic city. My first reaction was to respond that cities aren’t misanthropic, people are; but then I recalled my recent trip to Las Vegas. Returning to the affective register of urban design, I must say that Vegas is certainly a misanthropic city. It is a city built for money, not for people. To the extent that it is built for people, it is designed not to affirm or edify humanity’s highest qualities, but is rather constructed to amplify our basest and most animalistic aspects. Compulsion, lechery, and stupefaction are the human attributes “celebrated” in that space. From an urban design perspective, Las Vegas is among the most misanthropic of cities.

Of course, Thrift is not referring to misanthropic urban design (although the invocation of infrastructure is an interesting, and perhaps fecund, reference point for urban affect), but to misanthropic attitudes and behaviors among urban denizens. I do not subscribe to calls for kindness and an idealized sense of community in the city, as I find they are often simplistic and embarrassingly maudlin. Indeed, the disconnectedness and universal strangeness that have long been decried as manifestations of the inherent disharmony of urban life are in fact principal among the reasons that I love life in the metropolis. Nevertheless, I do appreciate that amidst the anxiety and imminent catastrophe of urban life, Thrift finds spaces for kindness and hope.

Thoughts on polemics, Audre Lorde, and Do the Right Thing

Radical black feminist writer and activist Audre Lorde found productive potential in anger. According to Lester Olson, in his article “Anger among allies”: “Lorde distinguished between anger and hatred, and she salvaged the former as potentially useful and generative” (p. 287). Lorde’s distinction between anger and hatred is developed in a quote from her remarks: “Hatred is the fury of those who do not share our goals, and its object is death and destruction. Anger is a grief of distortions between peers, and its object is change” (p. 298).

In a quote from her address titled “The Uses of Anger,” Lorde uses the metaphor of the virus to describe hatred:

“We are working in a context of oppression and threat, the cause of which is certainly not the angers which lie between us, but rather that virulent hatred leveled against all women, people of Color, lesbians and gay men, poor people – against all of us who are seeking to examine the particulars of our lives as we resist our oppressions, moving toward coalition and effective action.” (emphasis added)

This thematic link between hatred and disease is also present in Spike Lee’s film Do the Right Thing. While the film’s characters never state the distinction between anger and hatred as explicitly as Lorde does, the film makes many associations that establish a difference between the two. The action of the film takes place in a roughly 24-hour period, during the hottest day of the summer in Brooklyn, New York. The temperature is referenced throughout the film, and the link between the heat and the characters’ emotions is made early on. Anger is associated with heat: characters talk about “getting hot” as a metaphor for getting angry. By extension, then, the hottest day of the summer could also be understood as the angriest.

Hatred, on the other hand, is continually linked with sickness and disease. Early in the film, when pizzeria owner Sal arrives with his two sons to start business for the day, his son Pino says of the pizza shop:

“I detest this place like a sickness.”

Sal admonishes his son, saying: “That sounds like hatred.”

This connection returns at the end of the film, again in front of Sal’s Famous Pizzeria, which at this point has been reduced to a smoldering shell. Mookie seeks Sal out to ask for the wages he is due from the previous week’s labor. Angrily, Sal throws $500 in $100 bills at Mookie, twice as much as he is owed. Mookie leaves $200 on the ground, telling Sal that he only wants what he has earned. There is a stalemate as the two men stare each other down, the $200 between them, each waiting for the other to pick it up. Apparently not understanding why Mookie would leave the money lying on the ground, Sal asks him:

“Are you sick?”

Mookie replies: “I’m hot as a motherfucker; I’m alright, though.”

Mookie’s response here should not be understood merely as a comment about the weather. Yes, he is hot because of the summer heat, but the associations presented by the film make clear the deeper meaning of this exchange. Mookie is angry, angry as a motherfucker. He has endured the ordeal of the hottest day of the summer, culminating in his throwing a trash can through the pizzeria’s window, and he now finds himself, the following day, with his various responsibilities still in place but without a source of income. But he does not hate Sal. He is not infected by hatred. He is not sick.

If the film associates hatred with sickness and disease, how does it relate or portray love? The radio DJ character, Mister Señor Love Daddy, seems like an obvious connection. Another important component is the name of Señor Love Daddy’s radio station: We Love Radio 108 (“Last on your dial, first in your heart.”). The name of the radio station not only presages Clear Channel Communications’ eventual rebranding to iHeartRadio (kidding, of course), it also establishes a connection between love and another of the film’s characters: Radio Raheem.

Radio Raheem is arguably the character most closely associated with the concepts of love and hate. Raheem has custom brass knuckles on each hand: the word “LOVE” on his right hand, and the word “HATE” on his left. Through the presence of these words on his knuckles, and his performance of the accompanying story about the struggle between love and hate, “the story of life,” Radio Raheem recalls Reverend Harry Powell from the 1955 film Night of the Hunter. Reverend Powell has the words “love” and “hate” tattooed on his knuckles: love on the right hand, and hate on the left. He also tells “the story of life,” which, although it uses different language from Raheem’s, gives essentially the same account of a struggle between hate and love, in which hate has the upper hand for a while but is eventually beaten out by love.

In Night of the Hunter, Reverend Powell’s performance of pious geniality conceals a dark secret: he is a serial killer, traveling the country seducing widows whom he soon murders before absconding with what wealth he can steal. In Do the Right Thing, Radio Raheem is not revealed to be a serial killer, but he is done in by a sort of serial killing: the recurring killing of men of color perpetrated by police officers. The characters of the film react to Raheem’s death in a personal way (“They killed Radio Raheem!”), but the crowd’s response is clearly also fueled by this serial killing of black men (someone is heard exclaiming, “They did it again!”).

A final question: Is Do the Right Thing a polemic? I find it interesting to consider the question in light of the definitions offered by various authors. In her article on Larry Kramer’s polemical form, Erin Rand writes of polemics:

“Hence, polemics refute dominant ideologies and modes of thinking by rejecting the primacy of reason and invoking explicitly moral claims. In polemics a moral position is not simply advanced through rhetoric, but morality actually does rhetorical work.” (p. 305)

Rand traces the meaning of “polemic” to the Greek polemikos, meaning “warlike,” and when Lee’s film was released many reviewers and commentators were concerned that it amounted to a call for violence. I am not sure the film satisfies Rand’s four elements of rhetorical form, but I do believe it satisfies the rhetorical move that Olson calls shifting subjectivities:

“An advocate articulates a shift in the second persona of an address, wherein the auditors or readers occupy one kind of role initially and then, drawing on what is remembered or learned from that position, are repositioned subsequently into a different role that is harder for them to recognize or occupy, but that might possess some transforming power.” (p. 284)

As film critic Roger Ebert recounted in an essay about the film:

“Many audiences are shocked that the destruction of Sal’s begins with a trash can thrown through the window by Mookie (Lee), the employee Sal refers to as “like a son to me.” Mookie is a character we’re meant to like. Lee says he has been asked many times over the years if Mookie did the right thing. Then he observes: “Not one person of color has ever asked me that question.” But the movie in any event is not just about how the cops kill a black man and a mob burns down a pizzeria. That would be too simple, and this is not a simplistic film. It covers a day in the life of a Brooklyn street, so that we get to know the neighbors, and see by what small steps the tragedy is approached.”

Some critics and audience members objected to what they interpreted as Lee’s call for violence, and at least an implicit approval of property destruction. We heard similar rhetoric last year, when protests in response to the deaths of Michael Brown and Eric Garner were characterized by media emphasis on incidents of property damage and looting. The state response to protest is one of tolerance so long as demonstrations remain peaceful and “civil”; once that line is breached, the transgression functions to demonize and dismiss the “protestors” at large. Is this not evocative of the white woman who purportedly said to Audre Lorde, “Tell me how you feel, but don’t say it too harshly or I cannot hear you”?

Memes, Enthymemes, and the Reproduction of Ideology

[Image: Žižek ideology meme]

In his 1976 book The Selfish Gene, biologist Richard Dawkins introduced the word “meme” to refer to a hypothetical unit of cultural transmission. The discussion of the meme concept was contained in a single chapter of a book that was otherwise dedicated to genetic transmission, but the idea spread. Over decades, other authors further developed the meme concept, establishing “memetics” as a field of study. Today, the word “meme” has entered the popular lexicon, as well as popular culture, and is primarily associated with specific internet artifacts, or “viral” online content. Although this popular usage of the term is not always in keeping with Dawkins’ original conception, these examples from internet culture do illustrate some key features of how memes have been theorized.

This essay is principally concerned with two strands of memetic theory: the relation of memetic transmission to the reproduction of ideology; and the role of memes in rhetorical analysis, especially in relation to the enthymeme as persuasive appeal. Drawing on these theories, I will advance two related arguments: ideology as manifested in discursive acts can be considered to spread memetically; and ideology functions enthymematically. Lastly, I will present a case study analysis to demonstrate how methods and terminology from rhetorical criticism, discourse analysis, and media studies can be employed to analyze artifacts in light of these arguments.

Examples of memes presented by Dawkins include “tunes, ideas, catch-phrases, clothes fashions, ways of making pots or building arches” (p.192). The name “meme” was chosen due to its similarity to the word “gene”, as well as its relation to the Greek root “mimeme” meaning “that which is imitated” (p.192). Imitation is key to Dawkins’ notion of the meme because imitation is the means by which memes propagate themselves amongst members of a culture. Dawkins identifies three qualities associated with high survival in memes: longevity, fecundity, and copying-fidelity (p.194).

Distin (2005) further developed the meme hypothesis in The Selfish Meme. Furthering the gene/meme analogy, Distin defines memes as “units of cultural information” characterized by the representational content they carry (p.20), and the representational content is considered “the cultural equivalent of DNA” (p.37). This conceptualization of memes and their content forms the basis of Distin’s theory of cultural heredity. Distin then seeks to identify the representational system used by memes to carry their content (p.142). The first representational system considered is language, what Distin calls “the memes-as-words hypothesis” (p.145). Distin concludes that language itself is “too narrow to play the role of cultural DNA” (p.147).

Balkin (1998) took up the meme concept to develop a theory of ideology as “cultural software”. Balkin describes memes as “tools of understanding,” and states that there are “as many different kinds of memes as there are things that can be transmitted culturally” (p.48). Stating that the “standard view of memes as beliefs is remarkably similar to the standard view of ideology as a collection of beliefs” (p.49), Balkin links theories of memetic transmission to theories of ideology. Employing metaphors of virality similar to how other authors have written of memes as “mind viruses,” Balkin considers memetic transmission as the spread of “ideological viruses” through social networks of communication, stating that “this model of ideological effects is the model of memetic evolution through cultural communication” (p.109). Balkin also presents a more favorable view of language as a vehicle for memes than Distin does, writing: “Language is the most effective carrier of memes and is itself one of the most widespread forms of cultural software. Hence it is not surprising that many ideological mechanisms either have their source in features of language or are propagated through language” (p.175).

Balkin approaches the subject from a background in law, and although he is not a rhetorician and is skeptical of the discursive turn in theories of ideology, he does employ rhetorical concepts in discussing the influence of memes and ideology: “Rhetoric has power because understanding through rhetorical figures already forms part of our cultural software” (p.19). Balkin also cites Aristotle, remarking that “the successful rhetorician builds upon what the rhetorician and the audience have in common,” and “what the two have in common are shared cultural meanings and symbols” (p.209). In another passage, Balkin expresses a similar notion of the role of shared understanding in communication: “Much human communication requires the parties to infer and supplement what is being conveyed rather than simply uncoding it” (p.51).

Although Balkin never uses the term, these ideas are evocative of the rhetorical concept of the enthymeme. Aristotle himself discussed the enthymeme, though the concept was not elucidated with much specificity. Rhetorical scholars have since debated the nature of the enthymeme as employed in persuasion, and Bitzer (1959) surveyed various accounts to produce a more substantial definition. Bitzer’s analysis comes to focus on the enthymeme in relation to syllogisms, and the notion of the enthymeme as a syllogism with a missing (or unstated) proposition. Bitzer states: “To say that the enthymeme is an ‘incomplete syllogism’ – that is, a syllogism having one or more suppressed premises – means that the speaker does not lay down his premises but lets his audience supply them out of its stock of opinion and knowledge” (p.407).

Bitzer’s formulation of the enthymeme emphasizes that “enthymemes occur only when the speaker and audience jointly produce them” (p.408). That they are “jointly produced” is key to the role of the enthymeme in successful persuasive rhetoric: “Owing to the skill of the speaker, the audience itself helps construct the proofs by which it is persuaded” (p.408). Bitzer defines the “essential character” of the enthymeme as the fact that its “premises are always drawn from the audience” and that its “successful construction is accomplished through the joint efforts of speaker and audience.” This joint construction, and supplying of the missing premise(s), resonates with Balkin’s view of the spread of cultural software, as well as various theories of subjects’ complicity in the functioning of ideology.

McGee (1980) supplied another link between rhetoric and ideology with the “ideograph”. McGee argued that “ideology is a political language composed of slogan-like terms signifying collective commitment” (p.15), and these terms he calls “ideographs”. Examples of ideographs, according to McGee, include “liberty,” “religion,” and “property” (p.16). Johnson (2007) applies the ideograph concept to memetics, to argue for the usefulness of the meme as a tool for materialist criticism. Johnson argues that although “the ideograph has been honed as a tool for political (“P”-politics) discourses, such as those that populate legislative arenas, the meme can better assess ‘superficial’ cultural discourses” (p.29). I also believe that the meme concept can be a productive tool for ideological critique. As an example, I will apply the concepts of ideology reproduction as memetic transmission, and ideological function as enthymematic, in an analysis of artifacts of online culture popularly referred to as “memes”.

As Internet culture evolved, users adapted and mutated the term “meme” to refer to specific online artifacts. Though they may all be considered a type of online artifact, Internet memes come in a variety of forms. One of the oldest and most prominent series of image macro memes is “LOLcats”. The template established by LOLcats of superimposing humorous text over static images became, and remains, the standard format for image macro memes. Two of the most prominent series of these types of memes are the “First World Problems” (FWP) and “Third World Success” image macros. Through analysis of these memes, it is possible to examine how the features of these artifacts and discursive practices demonstrate many of the traits of memes developed by theorists, and how theories of memetic ideological transmission and enthymematic ideological function can be applied to examine the ideological characteristics of these artifacts.

References

Balkin, J. M. (1998). Cultural software: A theory of ideology. New Haven, CT: Yale University Press.

Bitzer, L. F. (1959). Aristotle’s enthymeme revisited. Quarterly Journal of Speech, 45(4), 399-408.

Dawkins, R. (2006). The selfish gene. New York, NY: Oxford University Press. (Original work published 1976)

Distin, K. (2005). The selfish meme: A critical reassessment. New York, NY: Cambridge University Press.

McGee, M. C. (1980). The “ideograph”: A link between rhetoric and ideology. Quarterly Journal of Speech, 66(1), 1-16.

Critical Pedagogy and Imperialism; social media and commodity fetishism

Gramsci has had a huge impact on critical pedagogy, especially because of the importance he attached to the role of culture, in both its highbrow and popular forms, in the process of hegemony, which combines rule by force with rule by consent. His discussion of the role of intellectuals in this process also influenced discussions centering on educators as cultural workers in the critical pedagogy field. Henry Giroux has been particularly influential here. One issue which deserves greater treatment in critical pedagogy, in my view, is that of ‘powerful knowledge’ which, though not necessarily popular knowledge and itself in need of being problematised, should still be mastered if one is not to remain at the margins of political life.

[…]

Following Freire, I would say: the commitment to teaching is a political commitment because education is a political act. There is no such thing as a neutral education. We must always ask: on whose side are we when we teach? More importantly, we should ask: with whom are we educating and learning? I ask this question in the spirit of Freire’s emphasis on working with rather than for the oppressed.

In tying Marxist ideology to social media, there are a number of things to clarify, as the comparison is not a perfect one. Perhaps the most questionable caveat is the ownership of the means of production. In the social media model, it can be said that the proletariat themselves own the means of production, since they typically own the computers or devices through which they channel their intellectual labor. Additionally, almost all popular social media networks today allow users to retain the copyright of the content that they post (Facebook, a; MySpace, n.d.; Twitter, n.d.). Thus, it would seem that the argument that users are alienated from the products of their intellectual labor power is moot.

[…]

I humbly suggest that in the social media model, owning the output or product of intellectual labor power has little if anything to do with Marx’s species-being. Instead, I feel that it is the social connections created, broken, strengthened, or weakened that feed directly into the worker’s species-being. Since the output of the intellectual labor power in this case is not a tangible good, the only “finished product” that the worker can place value in and not be alienated from is the social connection that their output generates, not the output itself. This allows for a supra or meta level of social connection above that of the social connections embodied in the physical outputs outlined by Marx.

 

TV still sucks, we should still complain about hipsters, your job shouldn’t exist

None of this could be happening at a worse time. According to the latest S.O.S. from climate science, we have maybe 15 years to enact a radical civilizational shift before game over. This may be generous, it may be alarmist; no one knows. What is certain is that pulling off a civilizational Houdini trick will require not just switching energy tracks, but somehow confronting the “endless growth” paradigm of the Industrial Revolution that continues to be shared by everyone from Charles Koch to Paul Krugman. We face very long odds in just getting our heads fully around our situation, let alone organizing around it. But it will be impossible if we no longer even understand the dangers of chuckling along to Kia commercials while flipping between Maher, “Merlin” and “Girls.”

  • Zaitchik’s article name-checks pertinent critics and theorists, including Adorno’s “culture industry,” Postman’s “Amusing Ourselves to Death,” and even Jerry Mander’s “Four Arguments for the Elimination of Television.” Where this article was discussed on sites like Reddit or Metafilter, commenters seemed angry at Zaitchik, overly defensive, as if they felt under attack for watching “Hannibal” and “Game of Thrones.” I thoroughly enjoyed Zaitchik’s piece, even if it doesn’t present a fully developed argument, because the perspective he presents strongly resonates with many of the philosophical foundations that have shaped my own views on media, particularly the media ecology tradition. A large part of Zaitchik’s argument is that even if television content is of the highest quality it has ever been, the form of television and its effects are the same as ever:

Staring at images on a little screen — that are edited in ways that weaken the brain’s capacity for sustained and critical thought, that encourage passivity and continued viewing, that are controlled by a handful of publicly traded corporations, that have baked into them lots of extremely slick and manipulating advertising — is not the most productive or pleasurable way to spend your time, whether you’re interested in serious social change, or just want to have a calm, clear and rewarding relationship with the real world around you.

But wait, you say, you’re not just being a killjoy and a bore, you’re living in the past. Television in 2014 is not the same as television in 1984, or 1994. That’s true. Chomsky’s “propaganda model,” set out during cable’s late dawn in “Manufacturing Consent,” is due for an update. The rise of on-demand viewing and token progressive programming has complicated the picture. But only by a little. The old arguments were about structure, advertising, structure, ownership, and structure, more than they were about programming content, or what time of the day you watched it. Less has changed than remains the same. By all means, let’s revisit the old arguments. That is, if everyone isn’t busy binge-watching “House of Cards.”

It’s been something to watch, this televisionification of the left. Open a window on social media during prime time, and you’ll find young journalists talking about TV under Twitter avatars of themselves in MSNBC makeup. Fifteen years ago, these people might have attended media reform congresses discussing how corporate TV pacifies and controls people, and how those facts flow from the nature of the medium. Today, they’re more likely to status-update themselves on their favorite corporate cable channel, as if this were something to brag about.

The entertainment demands of the 21st Century seem (apparently) bottomless. We’ve outsourced much of our serotonin production to the corporations which control music, sports, television, games, movies, and books. And they’ve grown increasingly desperate to produce the most universally acceptable, exportable, franchisable, exciting, boring, money-making pablum possible. Of course that is not new either… yet it continues to worsen.

Various alternative cultures have been attempting to fight it for decades. The beats, hippies, punks, and grunge kids all tried… and eventually lost. But the hipsters have avoided it altogether by never producing anything of substance except a lifestyle based upon fetishizing obscurity and cultivating tasteful disdain. A noncommittal and safe appreciation of ironic art and dead artists. No ideals, no demands, no struggle.

Rarely has the modern alternative to pop culture been so self-conscious and crippled. The mainstream has repeatedly beaten down and destroyed a half-century’s worth of attempts to keep art on a worthwhile and genuine path, but now it seems the final scion of those indie movements has adopted the ‘if you can’t beat ‘em, join ‘em’ compromise of creative death.

  • In an interview for PBS, London School of Economics professor David Graeber poses the question: should your job exist?

How could you have dignity in labor if you secretly believe your job shouldn’t exist? But, of course, you’re not going to tell your boss that. So I thought, you know, there must be enormous moral and spiritual damage done to our society. And then I thought, well, maybe that explains some other things, like why is it there’s this deep, popular resentment against people who have real jobs? They can get people so angry at auto-workers, just because they make 30 bucks an hour, which is like nowhere near what corporate lawyers make, but nobody seems to resent them. They get angry at the auto-workers; they get angry at teachers. They don’t get angry at school administrators, who actually make more money. Most of the problems people blame on teachers, and I think on some level, that’s resentment: all these people with meaningless jobs are saying, but, you guys get to teach kids, you get to make cars; that’s real work. We don’t get to do real work; you want benefits, too? That’s not reasonable.

If someone had designed a work regime perfectly suited to maintaining the power of finance capital, it’s hard to see how they could have done a better job. Real, productive workers are relentlessly squeezed and exploited. The remainder are divided between a terrorised stratum of the, universally reviled, unemployed and a larger stratum who are basically paid to do nothing, in positions designed to make them identify with the perspectives and sensibilities of the ruling class (managers, administrators, etc) – and particularly its financial avatars – but, at the same time, foster a simmering resentment against anyone whose work has clear and undeniable social value. Clearly, the system was never consciously designed. It emerged from almost a century of trial and error. But it is the only explanation for why, despite our technological capacities, we are not all working 3-4 hour days.

Ernesto Laclau dies

  • Ernesto Laclau, post-Marxist critical theorist and significant figure in discourse analysis (along with his wife and collaborator Chantal Mouffe), died on April 13.

Ernesto and Chantal used the work of Antonio Gramsci to reject what they saw as the reductionism and teleology of much Marxist theory. Though sometimes calling himself a ‘post-Marxist’ and an advocate of ‘radical democracy’, Ernesto insisted that he remained a radical anti-imperialist and anti-capitalist. His criticisms of Marx and Marxism were made in a constructive spirit, and without a hint of rancour.

Ernesto was recognised as a leading thinker in Latin America but also as an intellectual star in the academic world, co-authoring Contingency, Hegemony, Universality with Slavoj Žižek and Judith Butler in 2000. He gave courses at a string of leading universities in Europe and the Americas, including Northwestern and the New School for Social Research. Ernesto became Emeritus Professor at Essex in 2003, but the Centre he established continues its work.

With collaborators including his wife, Chantal Mouffe, and the cultural theorist Stuart Hall, Laclau played a key role in reformulating Marxist theory in the light of the collapse of communism and failure of social democracy. His “post-Marxist” manifesto Hegemony and Socialist Strategy (1985), written with Mouffe, was translated into 30 languages, and sales ran into six figures. The book argued that the class conflict identified by Marx was being superseded by new forms of identity and social awareness. This worried some on the left, including Laclau’s friend Ralph Miliband, who feared that he had lost touch with the mundane reality of class division and conflict, but his criticisms of Marx and Marxism were always made in a constructive spirit.

Political populism was an enduring fascination for Laclau. His first book, Politics and Ideology in Marxist Theory (1977), offered a polite but devastating critique of the conventional discourse on Latin America at the time. This “dependency” approach tended to see the large landowners – latifundistas – as semi-feudal and pre-capitalist, while Laclau showed them to be part and parcel of Latin American capitalism which fostered enormous wealth and desperate poverty.

Witnessing the impact of the Perónist movement in Argentina led Professor Laclau to a fascination with populism. He wrote a celebrated essay on the subject in the 1970s and then a full-length book, On Populist Reason (2005), looking at the rise of leftist politicians such as Hugo Chávez across much of Latin America. Both the current president of Argentina, Cristina Fernández de Kirchner, and her late husband and predecessor Néstor Kirchner, are said to have been great admirers of his work.

Laclau’s theory of populism has played a critical role in my research. Without his theoretical insights and captivating character, I could not have expanded my initial observations of populist practices to this level. Besides his theoretical legacy and rich intellectual input outside academia, Prof. Laclau also contributed to the training and development of students and researchers from different parts of the world, thanks to the IDA programme he founded. His death is a great loss.

Video mélange: David Harvey, Antonio Negri, and Saints Row IV

Ender’s Game analyzed, the Stanley Parable explored, Political Economy of zombies, semiotics of Twitter, much more

It’s been a long time since the last update (what happened to October?), so this post is extra long in an attempt to catch up.

  • I haven’t seen the new Ender’s Game movie, but this review by abbeyotis at Cyborgology calls the film “a lean and contemporary plunge into questions of morality mediated by technology”:

In a world in which interplanetary conflicts play out on screens, the government needs commanders who will never shrug off their campaigns as merely “virtual.” These same commanders must feel the stakes of their simulated battles to be as high as actual warfare (because, of course, they are). Card’s book makes the nostalgic claim that children are useful because they are innocent. Hood’s movie leaves nostalgia by the roadside, making the more complex assertion that they are useful because of their unique socialization to be intimately involved with, rather than detached from, simulations.

  • In the ongoing discourse about games criticism and its relation to film reviews, Bob Chipman’s latest Big Picture post uses his own review of the Ender’s Game film as an entry point for a breathless treatise on criticism. The video presents a concise and nuanced overview of arts criticism, from the classical era through film reviews as consumer reports up to the very much in-flux conceptions of games criticism. Personally, I find this video sub-genre (where spoken content is crammed into a Tommy-gun barrage of word bullets so that the narrator can convey a lot of information in a short running time) irritating and mostly worthless, since the verbal information is presented faster than the listener can really process it. It reminds me of Film Crit Hulk, someone who writes excellent essays with obvious insight into filmmaking, but whose aesthetic choice (or “gimmick”) of writing in all caps is often a distraction from the content and a deterrent to readers. Film Crit Hulk has of course addressed this issue and explained the rationale for the choice, but considering that his more recent articles have dropped the third-person “Hulk speak” writing style, the all caps seems played out. Nevertheless, I’m sharing the video because Mr. Chipman makes a lot of interesting points, particularly regarding the cultural contexts for the various forms of criticism. Just remember to breathe deeply and monitor your heart rate while watching.
  • In this video of a presentation titled Game design: the medium is the message, Jonathan Blow discusses how commercial constraints dictate the form of products from TV shows to video games.
  • This video from Satchbag’s Goods is ostensibly a review of Hotline Miami, but develops into a discussion of art movements and Kanye West:
  • This short interview with Slavoj Žižek in New York magazine continues a trend I’ve noticed since The Pervert’s Guide to Ideology has been in release, wherein writers interviewing Žižek feel compelled to include themselves and their reactions to/interactions with Žižek in their articles. Something about a Žižek encounter brings out the gonzo in journalists. The NY mag piece is also notable for this succinct positioning of Žižek’s contribution to critical theory:

Žižek, after all, the Yugoslav-born, Ljubljana-based academic and Hegelian; mascot of the Occupy movement, critic of the Occupy movement; and former Slovenian presidential candidate, whose most infamous contribution to intellectual history remains his redefinition of ideology from a Marxist false consciousness to a Freudian-Lacanian projection of the unconscious. Translation: To Žižek, all politics—from communist to social-democratic—are formed not by deliberate principles of freedom, or equality, but by expressions of repressed desires—shame, guilt, sexual insecurity. We’re convinced we’re drawing conclusions from an interpretable world when we’re actually just suffering involuntary psychic fantasies.

Following the development of the environment on the team’s blog, you can see some of the gaps between what data was deemed noteworthy or worth recording in the seventeenth century and the level of detail we now expect in maps and other infographics. For example, the team struggled to pinpoint the exact location on Pudding Lane of the bakery where the Great Fire of London is thought to have originated, and so just ended up placing it halfway along.

  • Stephen Totilo reviewed the new pirate-themed Assassin’s Creed game for the New York Times. I haven’t played the game, but I love that the sections of the game set in the present day have shifted from the standard global conspiracy tropes seen in the earlier installments to postmodern self-referential and meta-fictional framing:

Curiously, a new character is emerging in the series: Ubisoft itself, presented mostly in the form of self-parody in the guise of a fictional video game company, Abstergo Entertainment. We can play small sections as a developer in Abstergo’s Montreal headquarters. Our job is to help turn Kenway’s life — mined through DNA-sniffing gadgetry — into a mass-market video game adventure. We can also read management’s emails. The team debates whether games of this type could sell well if they focused more on peaceful, uplifting moments of humanity. Conflict is needed, someone argues. Violence sells.

It turns out that Abstergo is also a front for the villainous Templars, who search for history’s secrets when not creating entertainment to numb the population. In these sections, Ubisoft almost too cheekily aligns itself with the bad guys and justifies its inevitable 2015 Assassin’s Creed, set during yet another violent moment in world history.

The Stanley Parable wants you to think about it. The Stanley Parable, despite its very limited inputs (you can’t even jump, and very few objects are interactive) looks at those parts of first-person gaming that are least easy to design for – exploration and messing with the game’s engine – and foregrounds them. It takes the very limitations of traditional gaming narratives and uses them to ruthlessly expose their own flaws.

Roy’s research focus prior to founding Bluefin, and his continued interest while running the company, has to do with how both artificial and human intelligences learn language. In studying this process, he determined that the most important factor in meaning-making was the interaction between human beings: no one learns language in a vacuum, after all. That lesson helped inform his work at Twitter, which started with mapping the connection between social network activity and live broadcast television.

Aspiring to cinematic qualities is not bad in and of itself, nor do I mean to shame fellow game writers, but developers and their attendant press tend to be myopic in their point of view, both figuratively and literally. If we continually view videogames through a monocular lens, we miss much of their potential. And moreover, we begin to use ‘cinematic’ reflexively without taking the time to explain what the hell that word means.

Metaphor is a powerful tool. Thinking videogames through other media can reframe our expectations of what games can do, challenge our design habits, and reconfigure our critical vocabularies. To crib a quote from Andy Warhol, we get ‘a new idea, a new look, a new sex, a new pair of underwear.’ And as I hinted before, it turns out that fashion and videogames have some uncanny similarities.

Zombies started their life in the Hollywood of the 1930s and ‘40s as simplistic stand-ins for racist xenophobia. Post-millennial zombies have been hot-rodded by Danny Boyle and made into a subversive form of utopia. That grim utopianism was globalized by Max Brooks, and now Brad Pitt and his partners are working to transform it into a global franchise. But if zombies are to stay relevant, it will rely on the shambling monsters’ ability to stay subversive – and real subversive shocks and terror are not dystopian. They are utopian.

Ironically, our bodies now must make physical contact with devices dictating access to the real; Apple’s Touch ID sensor can discern for the most part if we are actually alive. This way, we don’t end up trying to find our stolen fingers on the black market, or prevent others from 3D scanning them to gain access to our lives.

This is a monumental shift from when Apple released its first iPhone just six years ago. It’s a touchy subject: fingerprinting authentication means we confer our trust in an inanimate object to manage our animate selves – our biology is verified, digitised, encrypted, as they are handed over to our devices.

Can you really buy heroin on the Web as easily as you might purchase the latest best-seller from Amazon? Not exactly, but as the FBI explained in its complaint, it wasn’t exactly rocket science, thanks to Tor and some bitcoins. Here’s a rundown of how Silk Road worked before the feds swooped in.

  • Henry Jenkins posted the transcript of an interview with Mark J.P. Wolf. The theme of the discussion is “imaginary worlds,” and they touch upon the narratology vs. ludology conflict in gaming:

The interactivity vs. storytelling debate is really a question of the author saying either “You choose” (interaction) or “I choose” (storytelling) regarding the events experienced; it can be all of one or all of the other, or some of each to varying degrees; and even when the author says “You choose”, you are still choosing from a set of options chosen by the author.  So it’s not just a question of how many choices you make, but how many options there are per choice.  Immersion, however, is a different issue, I think, which does not always rely on choice (such as immersive novels), unless you want to count “Continue reading” and “Stop reading” as two options you are constantly asked to choose between.

Epic EVE battle, Critical games criticism, indie developer self-publishing

  • I’ve never played EVE Online, and I don’t even really understand how it works, but I find it fascinating. Last week saw the biggest battle in the game’s history. This breakdown from The Verge is headlined like a real-life dispatch from the frontier of mankind’s space-faring endeavors: Largest space battle in history claims 2,900 ships, untold virtual lives

Update, 9:18PM ET: The battle is over. After more than five hours of combat, the CFC has defeated TEST Alliance. Over 2,900 ships were destroyed today in the largest fleet battle in Eve Online’s history. TEST Alliance intended to make a definitive statement in 6VDT, but their defeat at the hands of the CFC was decisive and will likely result in TEST’s withdrawal from the Fountain region.

In a conversation with Whitten, he told us that the commitment to independent developers is full. There won’t be restrictions on the type of titles that can be created, nor will there be limits in scope. In response to a question on whether retail-scale games could be published independently, Whitten told us, “Our goal is to give them access to the power of Xbox One, the power of Xbox Live, the cloud, Kinect, Smartglass. That’s what we think will actually generate a bunch of creativity on the system.” With regard to revenue splitting with developers, we were told that more information will be coming at Gamescom, but that we could think about it “generally like we think about Marketplace today.” According to developers we’ve spoken with, that split can be approximately 50-50.

Another difference between the Xbox One and Xbox 360 is how the games will be published and bought by other gamers. Indie games will not be relegated to the Xbox Live Indie Marketplace like on the Xbox 360 or required to have a Microsoft-certified publisher to distribute physically or digitally outside the Indie Marketplace. All games will be featured in one big area with access to all kinds of games.

If anything has hurt modern video game design over the past several years, it has been the rise of ‘freemium’. It is rare to see a top app or game in the app stores with a business model other than the ‘free-to-play with in-app purchases’ model. Freemium has been used as an excuse to make lazy, poorly designed games that are predicated on taking advantage of psychological triggers in their players, and it will have negative long-term consequences for the video game industry if left unchecked.

Many freemium games are designed around the idea of conditioning players to become addicted to playing the game. Many game designers want their games to be heavily played, but in this case the freemium games are designed to trigger a ‘reward’ state in the player’s brain in order to keep the player playing (and ultimately entice the user to make in-app purchases to continue playing). This type of conditioning is often referred to as a ‘Skinner box‘, named after the psychologist that created laboratory boxes used to perform behavioral experiments on animals.

It obviously isn’t beyond the realm of possibility that not only do financial considerations influence a game’s structure and content, but financial outcomes also affect a studio’s likelihood of survival in the industry, based upon the machinations of its publishing overlords. Activision killed Bizarre Creations, Eidos ruined Looking Glass Studios, EA crushed Westwood, Pandemic, Bullfrog, Origin Systems… well, the list could go on, until I turn a strange, purple color, but you get my point. And, when 3.4 million copies sold for a Tomb Raider reboot isn’t enough by a publisher’s standards, you can’t help but feel concern for a developer’s future.

This relationship between environment-learner-content interaction and transfer puts teachers in the unique position to capitalize on game engagement to promote reflection that positively shapes how students tackle real-world challenges. To some, this may seem like a shocking concept, but it’s definitely not a new one—roleplay as instruction, for example, was very popular among the ancient Greeks and, in many ways, served as the backbone for Plato’s renowned Allegory of the Cave. The same is true of Shakespeare’s works, 18th and 19th century opera, and many of the novels, movies, and other media that define our culture. More recently, NASA has applied game-like simulations to teach astronauts how to maneuver through space, medical schools have used them to teach robotic surgery, and the Federal Aviation Administration has employed them to test pilots.

The relationships between the creator, the product, and the audience are all important contexts to consider during media analysis, especially with games. This is because the audience is an active participant in the medium. So if you are creating a game, you always have to keep the audience in mind. Even if you say the audience doesn’t matter to you, it won’t cease to exist, and saying so does not erase the impact your game will have.

Similarly, if you are critiquing or analyzing any media, you can’t ignore the creator and the creator’s intentions. Despite those who claim the “death of the author,” if the audience is aware of the creator’s intentions, that awareness can affect how they perceive the game. Particularly, if you consider the ease with which creators can release statements about their work, you’ll have an audience with varying levels of awareness of the creator’s intentions. These factors all play off of each other–they do not exist in a vacuum.

When we talk about any medium’s legitimacy, be it film or videogames or painting, we are talking about a very historical phenomenon, inextricably tied to its artness, that allows them to get in on the ground floor of “legitimate” and “important.” So if we contextualize the qualities that allowed film or photography to find themselves supported through a panoply of cultural institutions, it was a cultural and political economic process that led them there.

[…]

Videogames, the kind that would be written about in 20 dollar glossy art magazines, would be exactly this. When creators of videogames want to point to their medium’s legitimacy, it would help to have a lot of smart people legitimate your work in a medium (glossy magazines, international newspapers) that you consider to be likewise legitimate. Spector concedes that ‘yes all the critics right now are online’, but the real battle is in getting these critics offline and into more “legitimate” spaces of representation. It’s a kind of unspoken hierarchy of mediums that is dancing before us here: at each step a new gatekeeper steps into play, both legitimating and separating the reader from the critic and the object of criticism.

All three games define fatherhood around the act of protection, primarily physical protection. And in each of these games, the protagonist fails—at least temporarily—to protect their ward. In Ethan’s case, his cheery family reflected in his pristine home collapses when he loses a son in a car accident. Later, when his other son goes missing, the game essentially tests Ethan’s ability to reclaim his protective-father status.

No video game grants absolute freedom; they all have rules or guidelines that govern what you can and can’t do. The sci-fi epic Mass Effect is a series that prides itself on choice, but even that trilogy ends on a variation of choosing between the “good” and “bad” ending. Minecraft, the open-world creation game, is extremely open-ended, but you can’t build a gun or construct a tower into space because it doesn’t let you. BioShock’s ending argues that the choices you think you’re making in these games don’t actually represent freedom. You’re just operating within the parameters set by the people in control, be they the developers or the guy in the game telling you to bash his skull with a golf club.

BioShock’s disappointing conclusion ends up illustrating Ryan’s point. A man chooses, a player obeys. It’s a grim and cynical message that emphasizes the constraints of its own art form. And given that the idea of choice is so important to BioShock’s story, I don’t think it could’ve ended any other way.

The Ideology of Scarface, Community as PoMo masterpiece, Present Shock reviewed, etc.

In The Godfather, the blurring of the line between crime and the “legitimate” economy can still seem shocking. In Scarface, the distinction seems quaintly naïve. In The Godfather, Don Vito almost loses everything over his refusal to deal in heroin. In Scarface, Tony Montana knows that coke is just another commodity in a boom economy. Michael Corleone marries the wispy, drooping Kay Adams to give his enterprise some old-fashioned, WASP class. When Tony Montana takes possession of the coked-up bombshell called Elvira Hancock, he is filling his waterbed with cash, not class. Even more excruciatingly, Scarface tells us these truths without any self-righteousness, without the consoling promise that manly discipline can save America from its fate. In the moral economy of this movie, the terms of critique have become indistinguishable from the terms of affirmation. “You know what capitalism is?” Tony answers his own question: “Getting fucked.”

Donovan put Neumann in charge of the Research and Analysis Branch of the OSS, studying Nazi-ruled central Europe. Neumann was soon joined by the philosopher Herbert Marcuse and the legal scholar Otto Kirchheimer, his colleagues at the left-wing Institute for Social Research, which had been founded in Frankfurt in 1923 but had moved to Columbia University after the Nazis came to power.

An update of the promise, that the media could create a different, even a better world, seems laughable from our perspective of experience with the technologically based democracies of markets. As a utopia-ersatz, this promise appears to be obsolete in the former hegemonial regions of North America and western and northern Europe. Now that it is possible to create a state with media, they are no longer any good for a revolution. The media are an indispensable component of functioning social hierarchies, both from the top down and the bottom up, of power and countervailing power. They have taken on systemic character. Without them, nothing works anymore in what the still surviving color supplements in a careless generalization continue to call a society. Media are an integral part of the everyday coercive context, which is termed “practical constraints.” As cultural techniques, which need to be learned for social fitness, they are at the greatest possible remove from what whips us into a state of excitement, induces aesthetic exultation, or triggers irritated thoughts.

[…]

At the same time, many universities have established courses in media design, media studies, and media management. Something that operates as a complex, dynamic, and edgy complex between the discourses, that is, something which can only operate interdiscursively, has acquired a firm and fixed place in the academic landscape. This is reassuring and creates professorial chairs, upon which a once anarchic element can be sat out and developed into knowledge for domination and control. Colleges and academies founded specifically for the media proactively seek close relationships with the industries, manufacturers, and the professional trades associations of design, orientation, and communication.

There are five ways Rushkoff thinks present shock is being experienced and responded to. To begin, we are in an era in which he thinks narrative has collapsed. For as long as we have had the power of speech we have corralled time into linear stories with a beginning, middle and ending. More often than not these stories contained some lesson. They were not merely forms of entertainment or launching points for reflection but contained some guidance as to how we should act in a given circumstance, which, of course, differed by culture, but almost all stories were in effect small oversimplified models of real life.

[…]

The medium Rushkoff thinks is best adapted to the decline of narrative is video games. Yes, they are more often than not violent, but they also seem tailor-made for the kinds of autonomy and collaborative play that are the positive manifestations of our new presentism.