Tagged: zizek

Chris Hedges interviews Chomsky; Žižek on illusion of freedom; Bogost on ‘Darmok’ and ‘Yo’

AP/Nader Daoud

Chomsky believes that the propaganda used to manufacture consent, even in the age of digital media, is losing its effectiveness as our reality bears less and less resemblance to the portrayal of reality by the organs of mass media. While state propaganda can still “drive the population into terror and fear and war hysteria, as we saw before the invasion of Iraq,” it is failing to maintain an unquestioned faith in the systems of power. Chomsky credits the Occupy movement, which he describes as a tactic, with “lighting a spark” and, most important, “breaking through the atomization of society.”

“There are all sorts of efforts to separate people from one another,” he said. “The ideal social unit [in the world of state propagandists] is you and your television screen. The Occupy actions brought that down for a large part of the population. People recognized that we could get together and do things for ourselves. We can have a common kitchen. We can have a place for public discourse. We can form our ideas. We can do something. This is an important attack on the core of the means by which the public is controlled. You are not just an individual trying to maximize consumption. You find there are other concerns in life. If those attitudes and associations can be sustained and move in new directions, that will be important.”

Not only have we learned a lot about the illegal activities of the US and other great powers. Not only have the WikiLeaks revelations put secret services on the defensive and set in motion legislative acts to better control them. WikiLeaks has achieved much more: millions of ordinary people have become aware of the society in which they live. Something that until now we silently tolerated as unproblematic is rendered problematic.

This is why Assange has been accused of causing so much harm. Yet there is no violence in what WikiLeaks is doing. We all know the classic scene from cartoons: the character reaches a precipice but goes on running, ignoring the fact that there is no ground underfoot; they start to fall only when they look down and notice the abyss. What WikiLeaks is doing is just reminding those in power to look down.

The reaction of all too many people, brainwashed by the media, to WikiLeaks’ revelations could best be summed up by the memorable lines of the final song from Altman’s film Nashville: “You may say I ain’t free but it don’t worry me.” WikiLeaks does make us worry. And, unfortunately, many people don’t like that.

“Darmok” gives us one vision of a future in which procedural rhetoric takes precedence over verbal and visual rhetoric, indeed in which the logic of logics subsumes the logics of description, appearances, and even of narrative—that preeminent form that even Troi mistakes as paramount to the Children of Tama. The Tamarians’ media ecosystem is the opposite of ours, one in which behaviors are taken as primary, and descriptions as secondary, almost incidental. The Children of Tama are less interesting as aliens than they are as counterfactual versions of us, if we preferred logic over image or description.

At the end of “Darmok,” Riker finds Captain Picard sitting in his ready room, reading from an ancient book rather than off a tablet. “Greek, sir?” Riker asks. “The Homeric Hymns,” Picard responds, “one of the root metaphors of our own culture.” “For the next time we encounter the Tamarians…” suggests the first officer. To which his captain replies, “More familiarity with our own mythology might help us relate to theirs.” A charming sentiment, and a move that always works for Star Trek—the juxtaposition of classical antiquity and science-fictional futurism. But Picard gets it wrong one last time. To represent the world as systems of interdependent logics we need not elevate those logics to the level of myth, nor focus on the logics of our myths. Instead, we would have to meditate on the logics in everything, to see the world as one built of weird, rusty machines whose gears squeal as they grind against one another, rather than as stories into which we might write ourselves as possible characters.

It’s stupid. There’s no other word for it. But according to TechCrunch, 50,000 people have sent 4 million Yos since the app was launched on, uhm, April Fool’s Day of this year. But sometimes in stupidity we find a kind of frankness, an honesty. For his part, Arbel has rather overstated the matter. “We like to call it context-based messaging,” he told The New York Times. “You understand by the context what is being said.”

[…]

Perhaps the problem with Yo isn’t what makes it stupid—its attempt to formalize the meta-communication common to online life—but what makes it gross: the need to contain all human activity within the logics of tech startups. The need to expect something from every idea, even the stupid ones, to feel that they deserve attention, users, data, and, inevitably, payout. Perhaps this is the greatest meta-communicative message of today’s technology scene. And it might not be inaccurate to summarize that message with a singular, guttural “yo.”

A ticklish subject: Decrying, defending Žižek as teacher

Wikimedia Commons

  • Slavoj Žižek’s pedagogy became a topic of debate among critics and supporters of the philosopher after video of an interview with Žižek was posted to YouTube. In the 10-minute video, recorded in April at the 2014 Žižek Conference in Cincinnati, Žižek discusses his loathing of office hours, among other subjects. Regarding classes he has taught in the U.S., Žižek recalls telling students, “If you don’t give me any of your shitty papers, you get an A.” Here is the video on YouTube:

I even told students at the New School for example… if you don’t give me any of your shitty papers, you get an A. If you give me a paper I may read it and not like it and you can get a lower grade.

  • And regarding office hours:

I can’t imagine a worse experience than some idiot comes there and starts to ask you questions, which is still tolerable. The problem is that here in the United States students tend to be so open that sooner or later, if you’re kind to them, they even start to ask you personal questions [about] private problems… What should I tell them?

Zizek has always been vocal about his general disdain for students and humanity writ large. He admitted in 2008 that seeing stupid people happy makes him depressed, before describing teaching as the worst job he has ever had.

[…]

On a personal note, I was once told at the New School by a senior faculty member that Zizek would fill up his sign-up sheet for office hours with fake names to avoid student contact. I still wonder if that story is true, but now it doesn’t seem so out of character.

I have no idea what a superstar like Žižek gets paid, and I don’t know if he actually fills his office-hours sign-up sheet with fake names so that none of the “boring idiots” come and bother him with their stupid problems, as one New School faculty member has apparently claimed. But I feel safe in guessing that he earns more to not-grade one “shitty paper” than many professors do in a semester.

The real problem with Žižek, in any case, isn’t that he feels this way or that he says these things aloud. It’s that he does so and people think it’s hilarious. It’s that his view is, believe it or not, a common “superstar” view of students—so common, in fact, that if you work at a research university and actually like teaching, you should maybe pretend you don’t, lest you appear not “serious” enough about your research.

[…]

The academy is in crisis. The humanities’ relevance is questioned obnoxiously on a near-daily basis. Humanists need to think carefully about who our heroes are, and who should represent our disciplines to the public. Maybe, just maybe, this Ži-jerk has finally proved himself unsuited to the task.

I’m sure all of us have stories of colleagues basically slandering their students, and there is no more common complaint in the academic world than the tedium of grading. I would venture to say that much of the resentment of Zizek’s attitudes stems from his critics’ unacknowledged desire to do exactly the things they’re castigating Zizek for. Wouldn’t it be awesome to be able to tell the students what I really think of them? Wouldn’t it be great not to have to deal with their crappy writing? Wouldn’t it be amazing to finally take the university at its word, valuing research absolutely and exclusively while making at best a token gesture toward teaching?

Indeed, it was disdain for teaching that made it so tempting to outsource pedagogical labor to grad students and underpaid adjuncts so that real professors could have the space to do real academic work. Zizek’s opinions aren’t some crazy outlier; they’re the structuring principles of our system of academic labor.

That I have met Zizek personally and can attest that he is no jerk is a minor point. That I personally witnessed him reject dinner with established professors and instead choose to sit with undergraduate students at a University of Rochester event is also fairly trivial but instructive about his actual attitude towards students.

[…]

No one has been more outspoken, or effective, about combating this “crisis” [in academia] than Zizek. He is steadfast in his criticism of the Bologna reforms in Europe. Rejecting globalization’s call for experts instead of critically thinking humanists cannot be accomplished through office hours and friendly teaching styles.

The risk of losing the liberal arts is inextricably linked to the intrusion of unfettered (ostensibly) “market” mechanisms throughout human life. Where there used to be at least some sanctuary, now there is none. Education is just one of the last to fall.

Mrs. Schuman also uses the basic strategy usually employed by politically right-wing authors who try to dismiss Žižek’s political engagements: taking one of his jokes out of context. I have followed the online comments of various professors about this particular statement of Žižek’s about not reading and examining student papers, and most of them shared some sympathy with his joke, agreeing that such work consumes a huge amount of their time and earns them little gratification. Mrs. Schuman fails to notice that Žižek is employed as a senior researcher and not as a regular professor, at least at his faculty in Ljubljana, so to attack him for not grading papers there is simply absurd: it is a formal post, and it is quite rare that he appears there to deliver a lecture at all. She also, like most of the people bashing him, fails to notice that the statement was a joke, like many such phrases produced mainly in recorded interviews to provoke a media response; in a way, Mrs. Schuman’s text could be said to have been expected before it was ever written.

But anyone who is familiar with how he develops theory should notice that he is also the International Director of the Institute for the Humanities at Birkbeck in London, where he annually holds very serious ‘masterclasses’ consisting of multiple successive days of lectures followed by discussions with his students. There is more than enough opportunity there for those who are genuinely interested in his work to offer their comments and criticism and, more importantly, to get a first-person perspective and a chance to collaborate on the development of his theory; his lectures there have often ended up as important parts of his big philosophical tomes later on. So he does teach classes, very important classes with philosophical consequences, and Mrs. Schuman repeats the accusation against him simply because she does not seem to be very familiar with his work. Those obsessed with Žižek’s place in the employment scheme of academia usually harbour resentment and envy over their own lack of luck at getting a satisfying job in the academic machinery and are just searching for quick attempts at dismissal.

Chomsky on Snowden, Žižek on Buddhism, Fuchs on social media and the public sphere

These exposures lead us to inquire into state policy more generally and the factors that drive it. The received standard version is that the primary goal of policy is security and defense against enemies.

The doctrine at once suggests a few questions: security for whom, and defense against which enemies? The answers are highlighted dramatically by the Snowden revelations.

Policy must assure the security of state authority and concentrations of domestic power, defending them from a frightening enemy: the domestic population, which can become a great danger if not controlled.

“Social media” has become a key term in Media and Communication Studies and in public discourse for characterising platforms such as Facebook, Twitter, YouTube, Wikipedia, LinkedIn, WordPress, Blogspot, Weibo, Pinterest, Foursquare and Tumblr. This lecture discusses the role of the concept of the public sphere for understanding social media critically. It argues against an idealistic interpretation of Habermas and for a cultural-materialist understanding of the public sphere concept that is grounded in political economy. It sets out that Habermas’ original notion is best understood as a method of immanent critique that critically scrutinises the limits of media and culture grounded in power relations and political economy. It introduces a theoretical model of public service media that it uses as a foundation for identifying three antagonisms of the contemporary social media sphere in the realms of the economy, the state and civil society. It concludes that these limits can only be overcome if the colonisation of the social media lifeworld is countered politically so that social media and the Internet become public service and commons-based media.

Žižek on post-U.S. order, Harvey on Piketty, Rushkoff’s new job and doc

The “American century” is over, and we have entered a period in which multiple centres of global capitalism have been forming. In the US, Europe, China and maybe Latin America, too, capitalist systems have developed with specific twists: the US stands for neoliberal capitalism, Europe for what remains of the welfare state, China for authoritarian capitalism, Latin America for populist capitalism. After the attempt by the US to impose itself as the sole superpower – the universal policeman – failed, there is now the need to establish the rules of interaction between these local centres as regards their conflicting interests.

In politics, age-old fixations, and particular, substantial ethnic, religious and cultural identities, have returned with a vengeance. Our predicament today is defined by this tension: the global free circulation of commodities is accompanied by growing separations in the social sphere. Since the fall of the Berlin Wall and the rise of the global market, new walls have begun emerging everywhere, separating peoples and their cultures. Perhaps the very survival of humanity depends on resolving this tension.

  • Thomas Piketty’s book Capital in the 21st Century has received widespread media attention, and enjoyed so much popular success that at times Amazon has been sold out of copies. It seems natural, then, that David Harvey, reigning champion of Marx’s Capital in the 21st century, would comment on the work, which he has now done on his web site:

The book has often been presented as a twenty-first century substitute for Karl Marx’s nineteenth century work of the same title. Piketty actually denies this was his intention, which is just as well since his is not a book about capital at all. It does not tell us why the crash of 2008 occurred and why it is taking so long for so many people to get out from under the dual burdens of prolonged unemployment and millions of houses lost to foreclosure. It does not help us understand why growth is currently so sluggish in the US as opposed to China and why Europe is locked down in a politics of austerity and an economy of stagnation. What Piketty does show statistically (and we should be indebted to him and his colleagues for this) is that capital has tended throughout its history to produce ever-greater levels of inequality. This is, for many of us, hardly news. It was, moreover, exactly Marx’s theoretical conclusion in Volume One of his version of Capital. Piketty fails to note this, which is not surprising since he has since claimed, in the face of accusations in the right wing press that he is a Marxist in disguise, not to have read Marx’s Capital.

[…]

There is, however, a central difficulty with Piketty’s argument. It rests on a mistaken definition of capital. Capital is a process, not a thing. It is a process of circulation in which money is used to make more money, often but not exclusively through the exploitation of labor power.

  • At the 2012 Media Ecology conference in Manhattan I heard Douglas Rushkoff explain that he had stopped teaching classes at NYU because the department was not letting him teach a sufficient number of hours, all while using his likeness on program brochures. Well, Rushkoff has just been appointed to his first full-time academic post. Media Bistro reported CUNY’s announcement:

Beginning this fall at CUNY’s Queens College, students can work their way towards an MA in Media Studies. Set to mold the curriculum is an expert responsible for terms such as “viral media” and “social currency.”

  • Lastly, this news made me realize that I completely missed Rushkoff’s new Frontline special that premiered in February: Generation Like, which is available on the Frontline web site.

Mike Gane interview: Baudrillard, academia, more

  • The upcoming issue of the International Journal of Baudrillard Studies features an interview with Baudrillard scholar Mike Gane. The interview touches upon a variety of topics, including Gane’s interactions with Baudrillard, media coverage of Margaret Thatcher’s death, and hypothesizing what Baudrillard would be writing about were he alive today:

One could ‘see’ the specific things Baudrillard would have picked up – extreme phenomena like sovereign debt. Today he would be writing on fracking, drones, etc.

Gane also addresses the present state of academia:

The essential point is that the whole educational experience has changed, and the student has become oriented to enterprise, and to developing, accumulating, human capital. The student gets used to appraising the lecturer’s performance just as the lecturer grades the student, and the Sunday Times grades the university. So, all the discussion about declining standards focuses on the wrong issue. What has happened is a transformation of individualism, not towards a new freedom in the classical liberal sense, but towards a new individual who builds up capital and exploits this competitively. The university staff members are equally thrown into a competitive game network, where to outperform others is essential to survival. Almost everything is assessed and ranked with a degree of Kafkaesque bureaucratisation that is hardly believable. Whereas the system of 40 years ago was simple and relaxed, with liberal values, and within it there were known traditional hierarchies, today it is hyper-bureaucratised and hyper-legalised and the hierarchies have changed and keep changing. Thus to understand what has happened it is essential to see that neoliberalism does not diminish the action of the state; it avoids direct state intervention but only to insert new mechanisms and values insidiously where none existed before: for example, in Britain it is only now, forty years after the initial entry of neoliberalism, that an enterprise element is being required on each degree course, and that an enterprise element is to be counted within the work profile of academics. And these new mechanisms do not stand still; the system is in constant movement, as if in permanent crisis. This is why Baudrillard, and others like Žižek, have described this as a new totalitarianism which works not by imposing a system of commands but rather a game framework into which the individual is absorbed and has to adapt at a moment’s notice.

  • In a recent Atlantic article Ian Bogost considered the McRib sandwich through the lens of Lacanian psychoanalysis. The aphoristic ending of the essay recalls the Baudrillardian turn on the function of Disneyland and prisons:

Yet, the McRib’s perversity is not a defect, but a feature. The purpose of the McRib is to make the McNugget seem normal.

Ender’s Game analyzed, The Stanley Parable explored, the political economy of zombies, the semiotics of Twitter, much more

It’s been a long time since the last update (what happened to October?), so this post is extra long in an attempt to catch up.

  • I haven’t seen the new Ender’s Game movie, but this review by abbeyotis at Cyborgology calls the film “a lean and contemporary plunge into questions of morality mediated by technology”:

In a world in which interplanetary conflicts play out on screens, the government needs commanders who will never shrug off their campaigns as merely “virtual.” These same commanders must feel the stakes of their simulated battles to be as high as actual warfare (because, of course, they are). Card’s book makes the nostalgic claim that children are useful because they are innocent. Hood’s movie leaves nostalgia by the roadside, making the more complex assertion that they are useful because of their unique socialization to be intimately involved with, rather than detached from, simulations.

  • In the ongoing discourse about games criticism and its relation to film reviews, Bob Chipman’s latest Big Picture post uses his own review of the Ender’s Game film as an entry point for a breathless treatise on criticism. The video presents a concise and nuanced overview of arts criticism, from the classical era through film reviews as consumer reports up to the very much in-flux conceptions of games criticism. Personally, I find this video sub-genre (where spoken content is crammed into a Tommy-gun barrage of word bullets so that the narrator can convey a lot of information in a short running time) irritating and mostly worthless, since the verbal information is presented faster than the listener can really process it. It reminds me of Film Crit Hulk, someone who writes excellent essays with obvious insight into filmmaking, but whose aesthetic choice (or “gimmick”) of writing in all caps is often a distraction from the content and a deterrent to readers. Film Crit Hulk has of course addressed this issue and explained the rationale for the choice, but considering that his more recent articles have dropped the third-person “Hulk speak” writing style, the all caps seems to be played out. Nevertheless, I’m sharing the video because Mr. Chipman makes a lot of interesting points, particularly regarding the cultural contexts for the various forms of criticism. Just remember to breathe deeply and monitor your heart rate while watching.
  • In this video of a presentation titled Game design: the medium is the message, Jonathan Blow discusses how commercial constraints dictate the form of products from TV shows to video games.
  • This video from Satchbag’s Goods is ostensibly a review of Hotline Miami, but develops into a discussion of art movements and Kanye West:
  • This short interview with Slavoj Žižek in New York magazine continues a trend I’ve noticed since The Pervert’s Guide to Ideology was released, wherein writers interviewing Žižek feel compelled to include themselves and their reactions to/interactions with Žižek in their articles. Something about a Žižek encounter brings out the gonzo in journalists. The NY mag piece is also notable for this succinct positioning of Žižek’s contribution to critical theory:

Žižek, after all, the Yugoslav-born, Ljubljana-based academic and Hegelian; mascot of the Occupy movement, critic of the Occupy movement; and former Slovenian presidential candidate, whose most infamous contribution to intellectual history remains his redefinition of ideology from a Marxist false consciousness to a Freudian-Lacanian projection of the unconscious. Translation: To Žižek, all politics—from communist to social-democratic—are formed not by deliberate principles of freedom, or equality, but by expressions of repressed desires—shame, guilt, sexual insecurity. We’re convinced we’re drawing conclusions from an interpretable world when we’re actually just suffering involuntary psychic fantasies.

Following the development of the environment on the team’s blog, you can see some of the gaps between what data was deemed noteworthy or worth recording in the seventeenth century and the level of detail we now expect in maps and other infographics. For example, the team struggled to pinpoint the exact location on Pudding Lane of the bakery where the Great Fire of London is thought to have originated and so just ended up placing it halfway along.

  • Stephen Totilo reviewed the new pirate-themed Assassin’s Creed game for the New York Times. I haven’t played the game, but I love that the sections of the game set in the present day have shifted from the standard global conspiracy tropes seen in the earlier installments to postmodern self-referential and meta-fictional framing:

Curiously, a new character is emerging in the series: Ubisoft itself, presented mostly in the form of self-parody in the guise of a fictional video game company, Abstergo Entertainment. We can play small sections as a developer in Abstergo’s Montreal headquarters. Our job is to help turn Kenway’s life — mined through DNA-sniffing gadgetry — into a mass-market video game adventure. We can also read management’s emails. The team debates whether games of this type could sell well if they focused more on peaceful, uplifting moments of humanity. Conflict is needed, someone argues. Violence sells.

It turns out that Abstergo is also a front for the villainous Templars, who search for history’s secrets when not creating entertainment to numb the population. In these sections, Ubisoft almost too cheekily aligns itself with the bad guys and justifies its inevitable 2015 Assassin’s Creed, set during yet another violent moment in world history.

The Stanley Parable wants you to think about it. The Stanley Parable, despite its very limited inputs (you can’t even jump, and very few objects are interactive), looks at those parts of first-person gaming that are least easy to design for – exploration and messing with the game’s engine – and foregrounds them. It takes the very limitations of traditional gaming narratives and uses them to ruthlessly expose their own flaws.

Roy’s research focus prior to founding Bluefin, and his continued interest while running the company, has to do with how both artificial and human intelligences learn language. In studying this process, he determined that the most important factor in meaning-making was the interaction between human beings: no one learns language in a vacuum, after all. That lesson helped inform his work at Twitter, which started with mapping the connection between social network activity and live broadcast television.

Aspiring to cinematic qualities is not bad in and of itself, nor do I mean to shame fellow game writers, but developers and their attendant press tend to be myopic in their point of view, both figuratively and literally. If we continually view videogames through a monocular lens, we miss much of their potential. And moreover, we begin to use ‘cinematic’ reflexively without taking the time to explain what the hell that word means.

Metaphor is a powerful tool. Thinking videogames through other media can reframe our expectations of what games can do, challenge our design habits, and reconfigure our critical vocabularies. To crib a quote from Andy Warhol, we get ‘a new idea, a new look, a new sex, a new pair of underwear.’ And as I hinted before, it turns out that fashion and videogames have some uncanny similarities.

Zombies started their life in the Hollywood of the 1930s and ‘40s as simplistic stand-ins for racist xenophobia. Post-millennial zombies have been hot-rodded by Danny Boyle and made into a subversive form of utopia. That grim utopianism was globalized by Max Brooks, and now Brad Pitt and his partners are working to transform it into a global franchise. But if zombies are to stay relevant, it will depend on the shambling monsters’ ability to stay subversive – and real subversive shocks and terror are not dystopian. They are utopian.

Ironically, our bodies now must make physical contact with devices dictating access to the real; Apple’s Touch ID sensor can, for the most part, discern whether we are actually alive. This way, we don’t end up trying to find our stolen fingers on the black market, and others are deterred from 3D-scanning them to gain access to our lives.

This is a monumental shift from when Apple released its first iPhone just six years ago. It’s a touchy subject: fingerprint authentication means we place our trust in an inanimate object to manage our animate selves – our biology is verified, digitised, and encrypted as it is handed over to our devices.

Can you really buy heroin on the Web as easily as you might purchase the latest best-seller from Amazon? Not exactly, but as the FBI explained in its complaint, it wasn’t exactly rocket science, thanks to Tor and some bitcoins. Here’s a rundown of how Silk Road worked before the feds swooped in.

  • Henry Jenkins posted the transcript of an interview with Mark J.P. Wolf. The theme of the discussion is “imaginary worlds,” and they touch upon the narratology vs. ludology conflict in gaming:

The interactivity vs. storytelling debate is really a question of the author saying either “You choose” (interaction) or “I choose” (storytelling) regarding the events experienced; it can be all of one or all of the other, or some of each to varying degrees; and even when the author says “You choose”, you are still choosing from a set of options chosen by the author. So it’s not just a question of how many choices you make, but how many options there are per choice. Immersion, however, is a different issue, I think, which does not always rely on choice (such as immersive novels), unless you want to count “Continue reading” and “Stop reading” as two options you are constantly asked to choose between.
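Wolf’s point about options per choice is easy to make concrete with a toy calculation of my own (an illustration, not something from the interview): with n choices and k options per choice, a branching story has k^n distinct paths, so both the number of choices and the width of each choice multiply the space of possible playthroughs.

    # Toy illustration (my own, not from the interview): how branching
    # narratives scale with choices (depth) and options per choice (width).
    def total_paths(choices: int, options_per_choice: int) -> int:
        """Distinct playthroughs when every choice offers the same number of options."""
        return options_per_choice ** choices

    print(total_paths(10, 2))  # 10 binary choices: 1,024 paths
    print(total_paths(10, 4))  # same 10 choices, 4 options each: 1,048,576 paths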

Inside Korea’s gaming culture, virtual worlds and economic modeling, Hollywood’s Summer of Doom continues, and more

  • I’ve long been fascinated by the gaming culture in South Korea, and Tom Massey has written a great feature piece for Eurogamer titled Seoul Caliber: Inside Korea’s Gaming Culture. To this westerner, who has never visited Korea, the article reads almost more like cyberpunk fiction than games journalism:

Not quite as ubiquitous, but still extremely common, are PC Bangs: LAN gaming hangouts where 1000 Won nets you an hour of multiplayer catharsis. In Gangnam’s Maxzone, overhead fans rotate at Apocalypse Now speed, slicing cigarette smoke as it snakes through the blades. Korea’s own NCSoft, whose European base is but a stone’s throw from the Eurogamer offices, is currently going strong with its latest MMO, Blade & Soul.

“It’s relaxing,” says Min-Su, sipping a Milkis purchased from the wall-mounted vending machine. “And dangerous,” he adds. “It’s easy to lose track of time playing these games, especially when you have so much invested in them. I’m always thinking about achieving the next level or taking on a quick quest to try to obtain a weapon, and the next thing I know I’ve been here for half the day.”

Creation and simulation in virtual worlds appear to offer the best domain to test the new ideas required to tackle the very real problems of deprivation, inequality, unemployment, and poverty that exist in national economies. On that note, the need to see our socioeconomic institutions for the games that they really are seems even more poignant.

In the words of Vili Lehdonvirta, a leading scholar of virtual goods and currencies, the suffering we see today is “not some consequence of natural or physical law”; rather, it “is a result of the way we play these games.”

The global economy seems to be bifurcating into a rich/tech track and a poor/non-tech track, not least because new technology will increasingly destroy/replace old non-tech jobs. (Yes, global. Foxconn is already replacing Chinese employees with one million robots.) So far, so fairly non-controversial.

The big thorny question is this: is technology destroying jobs faster than it creates them?

[…]

We live in an era of rapid exponential growth in technological capabilities. (Which may finally be slowing down, true, but that’s an issue for decades hence.) If you’re talking about the economic effects of technology in the 1980s, much less the 1930s or the nineteenth century, as if it has any relevance whatsoever to today’s situation, then you do not understand exponential growth. The present changes so much faster that the past is no guide at all; the difference is qualitative, not just quantitative. It’s like comparing a leisurely walk to relativistic speeds.
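The arithmetic behind that last image is simple enough to sketch. Assuming, purely for illustration, a two-year doubling period for technological capability (my assumption; the quoted piece names no figure), the gap between decades is multiplicative, not additive:

    # Back-of-the-envelope sketch, assuming a hypothetical two-year doubling
    # period for technological capability (an assumption for illustration,
    # not a figure from the quoted piece).
    def growth_factor(years: float, doubling_period: float = 2.0) -> float:
        """How many times capability multiplies over a given span of years."""
        return 2 ** (years / doubling_period)

    print(f"{growth_factor(10):,.0f}x over a decade")    # 32x
    print(f"{growth_factor(30):,.0f}x since the 1980s")  # 32,768x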

We begin with a love story–from a man who unwittingly fell in love with a chatbot on an online dating site. Then, we encounter a robot therapist whose inventor became so unnerved by its success that he pulled the plug. And we talk to the man who coded Cleverbot, a software program that learns from every new line of conversation it receives…and that’s chatting with more than 3 million humans each month. Then, five intrepid kids help us test a hypothesis about a toy designed to push our buttons, and play on our human empathy. And we meet a robot built to be so sentient that its creators hope it will one day have a consciousness, and a life, all its own.

“These outages are absolutely going to continue,” said Neil MacDonald, a fellow at technology research firm Gartner. “There has been an explosion in data across all types of enterprises. The complexity of the systems created to support big data is beyond the understanding of a single person and they also fail in ways that are beyond the comprehension of a single person.”

From high volume securities trading to the explosion in social media and the online consumption of entertainment, the amount of data being carried globally over the private networks, such as stock exchanges, and the public internet is placing unprecedented strain on websites and on the networks that connect them.

What I want is systems that have intrinsic rewards, systems that are disciplines similar to drawing or playing a musical instrument. I want systems which are their own reward.

What videogames almost always give me instead is labor that I must perform for an extrinsic reward. I want to convince you that not only is this not what I want, it isn’t really what anyone wants.

This ‘celebrification’ is enlivening the making of games and giving players role models, drawing more people into development, especially indie and auteured games. This shift is proving more prosperous than any Skillset-accredited course or government pot could ever hope for. We are making men sitting in pants at their laptops for 12 hours a day as glamorous as it could be.

Creating luminaries will lead to all the benefits that more people in games can bring: a bigger and brighter community, plus new and fresh talent making exciting games. However, celebritydom demands storms, turmoil and gossip.

Spielberg’s theory is essentially that a studio will eventually go under after it releases five or six bombs in a row. The reason: budgets have become so gigantic. And, indeed, this summer has been full of movies with giant budgets and modest grosses, all of which has elicited hand-wringing about financial losses, the lack of a quality product (another post-apocalyptic thriller? more superheroes?), and a possible connection between the two. There has been some hope that Hollywood’s troubles will lead to a rethinking of how movies get made, and which movies get greenlit by studio executives. But a close look at this summer’s grosses suggests a more worrisome possibility: that the studios will become more conservative and even less creative.