Category: News

Mind-controlled exoskeleton opens World Cup; AI will crash the stock market; Cortana’s personality

The exoskeleton — a system comprising a helmet implanted with a microchip that sticks out from the underside; a T-shirt loaded with sensors; metal leg braces; and a battery worn in a backpack — is set in motion when the user envisions himself making the kick. The chip translates those electronic commands to a digital language that powers the skeleton, which then moves accordingly. The T-shirt vibrates to enhance the user’s sensation of movement (and eliminate the need to look at his feet to see if he’s stepping forward).
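The chain described above (imagined movement read by the chip, decoded into commands for the braces, with the vibrating shirt closing the feedback loop) amounts to a sense-decode-actuate cycle. Purely as an illustrative sketch, with every class name, method, and threshold hypothetical rather than taken from the actual Walk Again Project hardware:

```python
# Illustrative sketch of the exoskeleton's sense-decode-actuate loop.
# All names and values here are hypothetical, for explanation only.

class BrainChip:
    def read_signal(self):
        # In the real system this would be decoded neural data; here, a stub.
        return {"intent": "step_forward", "confidence": 0.92}

class LegBraces:
    def execute(self, command):
        print(f"actuating: {command}")

class SensorShirt:
    def vibrate(self, region):
        print(f"haptic feedback: {region}")

def control_loop(chip, braces, shirt, threshold=0.8):
    """Decode an imagined movement and drive the braces,
    confirming the motion through haptic feedback."""
    signal = chip.read_signal()
    if signal["confidence"] >= threshold:
        braces.execute(signal["intent"])  # chip output becomes a motor command
        shirt.vibrate("feet")             # user feels the step without looking down
        return signal["intent"]
    return None
```

The feedback step matters: it is what lets the user sense the movement rather than visually verify it, as the article notes.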

Talk about dropping the ball. Earlier today, Juliano Pinto — a 29-year-old paraplegic — successfully kicked off the 2014 FIFA World Cup by using a mind-controlled exoskeleton. But sadly, most TV networks failed to show it.

After months of hype, the official broadcast of the opening ceremonies showed only a fraction of it, while some TV networks missed the event altogether. Commentators criticized the organizers for casting aside the moment in favor of the performing acts.

The invasion of high-frequency trading machines is now forcing capitalism far away from anything either Adam Smith or the founders of the NYSE could possibly find virtuous. 

We’re not about to let robots compete in the Olympics, driverless cars race in the Indianapolis 500, or automated machines play sports like football, basketball, or baseball. So why do we allow them to play a role in the most valuable contest of all, the worldwide stock exchange?

With crude forms of AI now entering the quant manipulator’s toolbox, we are teetering dangerously close to a total collapse of the stock market, one that will leave many corporations and individuals financially destitute.

  • Microsoft has announced its version of Apple’s Siri virtual assistant. Named Cortana, after the AI character from the Halo video game series, she is coming to Windows smartphones, and as Brad Molen at Engadget reports, developers programmed her with a distinct personality:

Confident, caring, competent, loyal; helpful, but not bossy: These are just some of the words Susan Hendrich, the project manager in charge of overseeing Cortana’s personality, used to describe the program’s most significant character traits. “She’s eager to learn and can be downright funny, peppering her answers with banter or a comeback,” Hendrich said. “She seeks familiarity, but her job is to be a personal assistant.” With that kind of list, it sure sounds like Hendrich’s describing a human. Which is precisely what she and her team set out to do during Cortana’s development: create an AI with human-like qualities.

Microsoft’s decision to infuse Cortana with a personality stemmed from one end goal: user attachment. “We did some research and found that people are more likely to interact with [AI] when it feels more human,” said Hendrich. To illustrate that desired human-machine dynamic, Hendrich pointed to her grandmother’s experience with a Roomba vacuum: “She gave a name and a personality to an inanimate object, and it brought her joy.” That sense of familiarity is exactly what Microsoft wants Windows Phone users to feel when interacting with Cortana on their own devices.

Ernesto Laclau dies

  • Ernesto Laclau, post-Marxist critical theorist and significant figure in discourse analysis (along with his wife and collaborator Chantal Mouffe), died on April 13.

Ernesto and Chantal used the work of Antonio Gramsci to reject what they saw as the reductionism and teleology of much Marxist theory. Though sometimes calling himself a ‘post-Marxist’ and an advocate of ‘radical democracy’, Ernesto insisted that he remained a radical anti-imperialist and anti-capitalist. His criticisms of Marx and Marxism were made in a constructive spirit, and without a hint of rancour.

Ernesto was recognised as a leading thinker in Latin America but also as an intellectual star in the academic world, co-authoring Contingency, Hegemony, Universality with Slavoj Žižek and Judith Butler in 2000. He gave courses at a string of leading universities in Europe and the Americas, including Northwestern and the New School for Social Research. Ernesto became emeritus professor at Essex in 2003, but the Centre he established continues its work.

With collaborators including his wife, Chantal Mouffe, and the cultural theorist Stuart Hall, Laclau played a key role in reformulating Marxist theory in the light of the collapse of communism and failure of social democracy. His “post-Marxist” manifesto Hegemony and Socialist Strategy (1985), written with Mouffe, was translated into 30 languages, and sales ran into six figures. The book argued that the class conflict identified by Marx was being superseded by new forms of identity and social awareness. This worried some on the left, including Laclau’s friend Ralph Miliband, who feared that he had lost touch with the mundane reality of class division and conflict, but his criticisms of Marx and Marxism were always made in a constructive spirit.

Political populism was an enduring fascination for Laclau. His first book, Politics and Ideology in Marxist Theory (1977), offered a polite but devastating critique of the conventional discourse on Latin America at the time. This “dependency” approach tended to see the large landowners – latifundistas – as semi-feudal and pre-capitalist, while Laclau showed them to be part and parcel of Latin American capitalism which fostered enormous wealth and desperate poverty.

Witnessing the impact of the Perónist movement in Argentina led Professor Laclau to a fascination with populism. He wrote a celebrated essay on the subject in the 1970s and then a full-length book, On Populist Reason (2005), looking at the rise of leftist politicians such as Hugo Chávez across much of Latin America. Both the current president of Argentina, Cristina Fernández de Kirchner, and her late husband and predecessor Néstor Kirchner, are said to have been great admirers of his work.

Laclau’s theory of populism has played a critical role in my research. Without his theoretical insights and captivating character, I could not have expanded my initial observations of populist practices to this level. Besides his theoretical legacy and rich intellectual input outside academia, Prof. Laclau also contributed to the training and development of students and researchers from different parts of the world – thanks to the IDA programme he founded. His death is a great loss.

Inside Korea’s gaming culture, virtual worlds and economic modeling, Hollywood’s Summer of Doom continues, and more

  • I’ve long been fascinated by the gaming culture in South Korea, and Tom Massey has written a great feature piece for Eurogamer titled Seoul Caliber: Inside Korea’s Gaming Culture. From this westerner’s perspective, having never visited Korea, the article reads almost more like cyberpunk fiction than games journalism:

Not quite as ubiquitous, but still extremely common, are PC Bangs: LAN gaming hangouts where 1000 Won nets you an hour of multiplayer catharsis. In Gangnam’s Maxzone, overhead fans rotate at Apocalypse Now speed, slicing cigarette smoke as it snakes through the blades. Korea’s own NCSoft, whose European base is but a stone’s throw from the Eurogamer offices, is currently going strong with its latest MMO, Blade & Soul.

“It’s relaxing,” says Min-Su, sipping a Milkis purchased from the wall-mounted vending machine. “And dangerous,” he adds. “It’s easy to lose track of time playing these games, especially when you have so much invested in them. I’m always thinking about achieving the next level or taking on a quick quest to try to obtain a weapon, and the next thing I know I’ve been here for half the day.”

Creation and simulation in virtual worlds appear to offer the best domain to test the new ideas required to tackle the very real problems of deprivation, inequality, unemployment, and poverty that exist in national economies. On that note, the need to see our socioeconomic institutions for the games that they really are seems even more pressing.

In the words of Vili Lehdonvirta, a leading scholar in virtual goods and currencies, the suffering we see today is “not some consequence of natural or physical law”; instead, it “is a result of the way we play these games.”

The global economy seems to be bifurcating into a rich/tech track and a poor/non-tech track, not least because new technology will increasingly destroy/replace old non-tech jobs. (Yes, global. Foxconn is already replacing Chinese employees with one million robots.) So far so fairly non-controversial.

The big thorny question is this: is technology destroying jobs faster than it creates them?

[…]

We live in an era of rapid exponential growth in technological capabilities. (Which may finally be slowing down, true, but that’s an issue for decades hence.) If you’re talking about the economic effects of technology in the 1980s, much less the 1930s or the nineteenth century, as if it has any relevance whatsoever to today’s situation, then you do not understand exponential growth. The present changes so much faster that the past is no guide at all; the difference is qualitative, not just quantitative. It’s like comparing a leisurely walk to relativistic speeds.
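A quick back-of-the-envelope calculation makes the point concrete; the numbers below are illustrative, assuming a capability that doubles every two years:

```python
# Illustrative arithmetic: under exponential growth, the change packed into
# three decades dwarfs the change of any single earlier decade.
def growth_factor(years, doubling_period=2):
    """Total multiplication after `years`, doubling every `doubling_period` years."""
    return 2 ** (years / doubling_period)

print(growth_factor(10))  # 32x over one decade
print(growth_factor(30))  # 32768x over three decades
```

On these hypothetical numbers, lessons drawn from a 32x world transfer poorly to a 30,000x one, which is the author’s point about reasoning from the 1980s.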

We begin with a love story–from a man who unwittingly fell in love with a chatbot on an online dating site. Then, we encounter a robot therapist whose inventor became so unnerved by its success that he pulled the plug. And we talk to the man who coded Cleverbot, a software program that learns from every new line of conversation it receives…and that’s chatting with more than 3 million humans each month. Then, five intrepid kids help us test a hypothesis about a toy designed to push our buttons, and play on our human empathy. And we meet a robot built to be so sentient that its creators hope it will one day have a consciousness, and a life, all its own.

“These outages are absolutely going to continue,” said Neil MacDonald, a fellow at technology research firm Gartner. “There has been an explosion in data across all types of enterprises. The complexity of the systems created to support big data is beyond the understanding of a single person and they also fail in ways that are beyond the comprehension of a single person.”

From high-volume securities trading to the explosion in social media and the online consumption of entertainment, the amount of data being carried globally over private networks, such as stock exchanges, and over the public internet is placing unprecedented strain on websites and on the networks that connect them.

What I want is systems that have intrinsic rewards; that are disciplines similar to drawing or playing a musical instrument. I want systems which are their own reward.

What videogames almost always give me instead is labor that I must perform for an extrinsic reward. I want to convince you that not only is this not what I want, it isn’t really what anyone wants.

This ‘celebrification’ is enlivening the making of games and giving players role models, drawing more people into development, especially indie and auteured games. This shift is proving more prosperous than any Skillset-accredited course or government pot could ever hope for. We are making men sitting in pants at their laptops for 12 hours a day as glamorous as it can be.

Creating luminaries will lead to all the benefits that more people in games can bring: a bigger and brighter community, plus new and fresh talent making exciting games. However, celebritydom demands storms, turmoil and gossip.

Spielberg’s theory is essentially that a studio will eventually go under after it releases five or six bombs in a row. The reason: budgets have become so gigantic. And, indeed, this summer has been full of movies with giant budgets and modest grosses, all of which has elicited hand-wringing about financial losses, the lack of a quality product (another post-apocalyptic thriller? more superheroes?), and a possible connection between the two. There has been some hope that Hollywood’s troubles will lead to a rethinking of how movies get made, and which movies get greenlit by studio executives. But a close look at this summer’s grosses suggest a more worrisome possibility: that the studios will become more conservative and even less creative.

Epic EVE battle, Critical games criticism, indie developer self-publishing

  • I’ve never played EVE Online, and I don’t even really understand how it works, but I find it fascinating. Last week saw the biggest battle in the game’s history. This breakdown from The Verge is headlined like a real-life dispatch from the frontier of mankind’s space-faring endeavors: Largest space battle in history claims 2,900 ships, untold virtual lives

Update, 9:18PM ET: The battle is over. After more than five hours of combat, the CFC has defeated TEST Alliance. Over 2,900 ships were destroyed today in the largest fleet battle in Eve Online’s history. TEST Alliance intended to make a definitive statement in 6VDT, but their defeat at the hands of the CFC was decisive and will likely result in TEST’s withdrawal from the Fountain region.

In a conversation with Whitten, he told us that the commitment to independent developers is full. There won’t be restrictions on the type of titles that can be created, nor will there be limits in scope. In response to a question on whether retail-scale games could be published independently, Whitten told us, “Our goal is to give them access to the power of Xbox One, the power of Xbox Live, the cloud, Kinect, Smartglass. That’s what we think will actually generate a bunch of creativity on the system.” With regard to revenue splitting with developers, we were told that more information will be coming at Gamescom, but that we could think about it “generally like we think about Marketplace today.” According to developers we’ve spoken with, that split can be approximately 50-50.

Another difference between the Xbox One and Xbox 360 is how the games will be published and bought by other gamers. Indie games will not be relegated to the Xbox Live Indie Marketplace like on the Xbox 360 or required to have a Microsoft-certified publisher to distribute physically or digitally outside the Indie Marketplace. All games will be featured in one big area with access to all kinds of games.

If anything has hurt modern video game design over the past several years, it has been the rise of ‘freemium’. It is rare to see a top app or game in the app stores with a business model other than ‘free-to-play with in-app purchases’. The model has been used as an excuse to make lazy, poorly designed games predicated on taking advantage of psychological triggers in their players, and it will have negative long-term consequences for the video game industry if left unchecked.

Many freemium games are designed around the idea of conditioning players to become addicted to playing the game. Many game designers want their games to be heavily played, but in this case the freemium games are designed to trigger a ‘reward’ state in the player’s brain in order to keep the player playing (and ultimately entice the user to make in-app purchases to continue playing). This type of conditioning is often referred to as a ‘Skinner box’, named after the psychologist B. F. Skinner, who created laboratory boxes used to perform behavioral experiments on animals.
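The conditioning loop described above usually takes the form of a variable-ratio reward schedule: payouts arrive unpredictably, which reinforces the behavior far more strongly than predictable rewards do. A minimal sketch, with a made-up payout probability rather than any real game’s tuning:

```python
import random

def play_session(actions, reward_chance=0.15, seed=42):
    """Simulate a variable-ratio schedule: each action *might* pay out.
    Because the player never knows which action will be rewarded,
    every action feels potentially lucky — the mechanism freemium
    designs exploit to keep players tapping."""
    rng = random.Random(seed)  # seeded for reproducibility in this sketch
    rewards = sum(1 for _ in range(actions) if rng.random() < reward_chance)
    return rewards

print(play_session(100))  # number of unpredictable payouts in 100 actions
```

The 15% chance here is an arbitrary illustration; the point is only that the reward arrives on an unpredictable subset of actions.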

It obviously isn’t beyond the realm of possibility that not only do financial considerations influence a game’s structure and content, but financial outcomes also affect a studio’s likelihood of survival in the industry, based upon the machinations of its publishing overlords. Activision killed Bizarre Creations, Eidos ruined Looking Glass Studios, EA crushed Westwood, Pandemic, Bullfrog, Origin Systems… well, the list could go on, until I turn a strange, purple color, but you get my point. And, when 3.4 million copies sold for a Tomb Raider reboot isn’t enough by a publisher’s standards, you can’t help but feel concern for a developer’s future.

This relationship between environment-learner-content interaction and transfer puts teachers in the unique position to capitalize on game engagement to promote reflection that positively shapes how students tackle real-world challenges. To some, this may seem like a shocking concept, but it’s definitely not a new one—roleplay as instruction, for example, was very popular among the ancient Greeks and, in many ways, served as the backbone for Plato’s renowned Allegory of the Cave. The same is true of Shakespeare’s works, 18th and 19th century opera, and many of the novels, movies, and other media that define our culture. More recently, NASA has applied game-like simulations to teach astronauts how to maneuver through space, medical schools have used them to teach robotic surgery, and the Federal Aviation Administration has employed them to test pilots.

The relationships between the creator, the product, and the audience are all important contexts to consider during media analysis, especially with games. This is because the audience is an active participant in the media. So if you are creating a game, you always have to keep the audience in mind. Even if you say the audience doesn’t matter to you, it won’t cease to exist, and saying so does not erase the impact your game will have.

Similarly, if you are critiquing or analyzing any media, you can’t ignore the creator and the creator’s intentions. Despite those who claim the “death of the author,” if the audience is aware of the creator’s intentions, it can affect how they perceive the game. Particularly, if you consider the ease with which creators can release statements talking about their work, you’ll have an audience with varying levels of awareness about the creator’s intentions. These factors all play off of each other–they do not exist in a vacuum.

When we talk about any medium’s legitimacy, be it film or videogames or painting, it’s a historical phenomenon: a medium’s claim to “artness” is what allows it to get in on the ground floor of “legitimate” and “important.” If we contextualize the qualities that allowed film or photography to find themselves supported through a panoply of cultural institutions, it was a cultural and political-economic process that led them there.

[…]

Videogames, the kind that would be written about in 20 dollar glossy art magazines, would be exactly this. When creators of videogames want to point to their medium’s legitimacy, it would help to have a lot of smart people legitimate your work in a medium (glossy magazines, international newspapers) that you consider to be likewise legitimate. Spector concedes that ‘yes all the critics right now are online’, but the real battle is in getting these critics offline and into more “legitimate” spaces of representation. It’s a kind of unspoken hierarchy of mediums that is dancing before us here: at each step a new gatekeeper steps into play, both legitimating and separating the reader from the critic and the object of criticism.

All three games define fatherhood around the act of protection, primarily physical protection. And in each of these games, the protagonist fails—at least temporarily—to protect their ward. In Ethan’s case, his cheery family reflected in his pristine home collapses when he loses a son in a car accident. Later, when his other son goes missing, the game essentially tests Ethan’s ability to reclaim his protective-father status.

No video game grants absolute freedom; they all have rules or guidelines that govern what you can and can’t do. The sci-fi epic Mass Effect is a series that prides itself on choice, but even that trilogy ends on a variation of choosing between the “good” and “bad” ending. Minecraft, the open-world creation game, is extremely open-ended, but you can’t build a gun or construct a tower into space because it doesn’t let you. BioShock’s ending argues that the choices you think you’re making in these games don’t actually represent freedom. You’re just operating within the parameters set by the people in control, be they the developers or the guy in the game telling you to bash his skull with a golf club.

BioShock’s disappointing conclusion ends up illustrating Ryan’s point. A man chooses, a player obeys. It’s a grim and cynical message that emphasizes the constraints of its own art form. And given that the idea of choice is so important to BioShock’s story, I don’t think it could’ve ended any other way.

Zimmerman media coverage, remediation in Game of Thrones, Scorcese on reading cinema, and much more

The reports are based on an ABC News interview with Juror B29, the sole nonwhite juror. She has identified herself only by her first name, Maddy. She’s been framed as the woman who was bullied out of voting to convict Zimmerman. But that’s not true. She stands by the verdict. She yielded to the evidence and the law, not to bullying. She thinks Zimmerman was morally culpable but not legally guilty. And she wants us to distinguish between this trial and larger questions of race and justice.

ABC News hasn’t posted a full unedited video or transcript of the interview. The video that has been broadcast—on World News Tonight, Nightline, and Good Morning America—has been cut and spliced in different ways, often so artfully that the transitions appear continuous. So beware what you’re seeing. But the video that’s available already shows, on closer inspection, that Maddy has been manipulated and misrepresented. Here are the key points.

In the recording heard by NBC viewers, Zimmerman appeared to volunteer the information, “This guy looks like he’s up to no good. He looks black.”

Edited out was the 911 dispatcher asking Zimmerman if the person he was suspicious of was “black, white or Hispanic,” to which Zimmerman had responded, “He looks black.”

Though Zimmerman and his attorneys have filed a lawsuit against NBC News for the malicious editing of the 911 tape, what CNN did is far worse.

NBC News was attempting to make Zimmerman look like a racial profiler. CNN, on the other hand, was attempting to make Zimmerman look like an enraged outright racist (there was no racial angle in ABC’s fraud). It also took CNN far longer to retract their story than either NBC or ABC.

Moreover, on its own airwaves, CNN would allow the complete fallacy that Zimmerman had said “fucking coon” to live on.

Pulling teeth doesn’t do justice to the painful viewing experience accompanying this sort of news manufacture – making news from no news. Even the daily palaver known as Changing the Guard was spun to look like an integral prelude to the long-awaited arrival. And the waiting went on, and on, and on, and the longer it went on, the more desperate and dull the coverage became. Sometimes people complain about the high salaries enjoyed by news presenters, especially the public service variety, but by golly they earnt their crust trying, albeit failing, to sustain the suspense.

Light is at the beginning of cinema, of course. It’s fundamental—because cinema is created with light, and it’s still best seen projected in dark rooms, where it’s the only source of light. But light is also at the beginning of everything. Most creation myths start with darkness, and then the real beginning comes with light—which means the creation of forms. Which leads to distinguishing one thing from another, and ourselves from the rest of the world. Recognizing patterns, similarities, differences, naming things—interpreting the world. Metaphors—seeing one thing “in light of” something else. Becoming “enlightened.” Light is at the core of who we are and how we understand ourselves.

[…]

Or consider the famous Stargate sequence from Stanley Kubrick’s monumental 2001: A Space Odyssey. Narrative, abstraction, speed, movement, stillness, life, death—they’re all up there. Again we find ourselves back at that mystical urge—to explore, to create movement, to go faster and faster, and maybe find some kind of peace at the heart of it, a state of pure being.

Despite stormy forecasts, Hollywood appears to be too unwieldy or too unwilling to shift direction towards smaller, cheaper pictures. Guests at Comic-Con learned about upcoming studio productions including Pirates of the Caribbean 5, Thor 2, Fantastic Four 3 and a reboot of Godzilla. The director Joss Whedon came to the event to lament that “pop culture is eating itself” and called for “new universes, new messages and new icons”. He then revealed the title of his next film to be Avengers: Age of Ultron.

Repeat after me: Edward Snowden is not the story. The story is what he has revealed about the hidden wiring of our networked world. This insight seems to have escaped most of the world’s mainstream media, for reasons that escape me but would not have surprised Evelyn Waugh, whose contempt for journalists was one of his few endearing characteristics. The obvious explanations are: incorrigible ignorance; the imperative to personalise stories; or gullibility in swallowing US government spin, which brands Snowden as a spy rather than a whistleblower.

The video site is aiming to showcase some geek culture by pronouncing 4-10 August its first ever ‘Geek Week’ and promoting some of the genre’s top channels which cover everything from sci-fi to comics, gaming and superheroes. To do this, its own channel will be featuring videos from users like Nerdist, the official Doctor Who channel, MinutePhysics and more than a hundred others, with every day of the week hosted by a different user. It’ll even include the first trailer for the new Thor movie, The Dark World.

That said, things kept nagging me. Blackfish does raise some valuable secondary issues – how SeaWorld markets itself, how labor issues are at stake in addition to environmental ones – but as a spectator I kept wanting the film to pursue lines of analysis that it would suggest but never develop.

[…]

In short, if there’s an ur-ideology to the American progressive documentary, it’s that demand-side drivers of political situations (Gramsci’s hegemony, ideology, what have you) don’t matter, it’s merely the supply side of oligopoly, big money, and corporate control. Or to be less political, as a film scholar I can’t help but notice that in a film about the business of spectacle, the spectator is both crucial (SeaWorld viewers provide the vital footage of the incidents) and completely effaced.

And what of the YouTube creator? How has AdSense helped or hindered their careers? In most cases, the advertising structure has been a blessing to creators as it’s allowed them to launch careers solely through YouTube. AdSense gave us a new type of celebrity for a new generation.

Creators have had their fair share of AdSense woes in the past, though. Last year, one of YouTube’s biggest names, Ray William Johnson, entered a very public dispute with Maker Studios. Johnson claimed that Maker Studios was holding his AdSense account “hostage” even after he had terminated his contract with them.

If you watch big budget entertainments, there’s no escaping these sorts of moments. The trope familiar to the Scooby-Doo generation, in which a few nagging uncertainties are resolved with a “there’s just one thing I don’t understand” kickoff, has now become a motif. Characters must constantly address questions on behalf of a too-curious audience awash in complexly-plotted mega-stories. The movies are trying to plug leaks in a boat before the whole thing sinks—never quite repairing it, but doing just enough to get by.

What I’m talking about here is the unavoidable shift that occurs when content is remediated—that is, borrowed from one medium and reimagined in another. In this case, the content of the book series A Song of Ice and Fire (ASOIAF) is remediated to Game of Thrones, the HBO television series. Some of the differences in this instance of remediation seem pragmatic—remembrances are turned into scenes of their own, dialogue is shortened, characters omitted or altered for the sake of brevity and clarity. I am no purist, and I recognize that with remediation comes necessary alteration for the content to suit the new medium. But other differences speak volumes about our cultural biases and expectations surrounding those with socially-othered bodies—like Tyrion, Sam, and, of course, women. What can we say about these differences? And perhaps more importantly, what do they say about us?

Why does it matter what Kubrick liked? For years I’ve enjoyed unearthing as much information as I can about his favourite films and it slowly became a personal hobby. Partly because each time I came across such a film (usually from a newly disclosed anecdote – thanks internet! – or Taschen’s incredible The Stanley Kubrick Archives book) I could use it as a prism to reveal more about his sensibilities. My appreciation of both him and the films he liked grew. These discoveries led me on a fascinating trail, as I peppered them throughout the 11 existing Kubrick features (not counting the two he disowned) I try to watch every couple of years. I’m sure a decent film festival could be themed around the Master List at the end of this article…

  • Lastly, the Media Ecology Association has uploaded some videos from their latest annual convention which was held in June. These include Dominique Scheffel-Dunand on canonic texts in media ecology and Lance Strate’s talk “If not A, then E“.

Google settles over privacy violations, Social media segregation, the era of big data, and more…

  • Google is reportedly reaching a settlement with the Federal Trade Commission over an incident in which the Internet search giant violated an agreement with the FTC by tracking Safari users’ data. From the Associated Press:

Google is poised to pay a $22.5 million fine to resolve allegations that it broke a privacy promise by secretly tracking millions of Web surfers who rely on Apple’s Safari browser, according to a person familiar with the settlement.

If approved by the FTC’s five commissioners, the $22.5 million penalty would be the largest the agency has ever imposed on a single company.

  • Adrianne Jeffries at BetaBeat covers a BBC report on how users of specific websites break down along racial demographics. The article misleadingly refers to “segregation” in social media, but the information and analysis by danah boyd is interesting:

Pinterest is 70 percent female and 79 percent white, according to the BBC. By contrast, black and Latino users are overrepresented on Twitter versus the general population.

Ms. Boyd theorized that there was an exodus of users from Myspace to Facebook similar to white flight to the suburbs when the U.S. desegregated schools. Facebook, the vanilla of social media sites, was approaching the makeup of the U.S. population at the time of an analysis done in 2009. That was the year that white users stopped being overrepresented and black and Latino users stopped being underrepresented.

Among companies of more than 1,000 employees in 15 out of the economy’s 17 sectors, the average amount of data is a surreal 235 terabytes. That’s right — each of these companies has more info than the Library of Congress. And so, why should we care? Because data is valuable. The growth of digital networks and the networked sensors in everything from phones to cars to heavy machinery mean that data has a reach and sweep it has never had before. The key to Big Data is connecting these sensors to computing intelligence which can make sense of all this information (in pure Wall-E style, some theorists call this the Internet of Things).

  • This short post at Kethu.org presents survey data and rhetorically wonders whether social media behaviors negatively impact life enjoyment:

Consider this: 24% of respondents to one survey said they’ve missed out on enjoying special moments in person because — ironically enough — they were too busy trying to document their experiences for online sharing. Many of us have had to remind ourselves to “live in the now” — instead of worry about composing the perfect tweet or angling for just the right Instagram shot.

I’m coming to believe that classroom time is too limiting in the teaching of tools. At CUNY, we’ve seen over the years that students come in with widening gulfs in both their prior experience and their future ambitions in tools and technologies. My colleagues at CUNY, led by Sandeep Junnarkar, have implemented many new modules and courses to teach such topics as data journalism (gathering, analysis, visualization) and familiarity with programming.

Note well that I have argued since coming to CUNY that we should not and cannot turn out coders. I also do not subscribe to the belief that journalism’s salvation lies in hunting down that elusive unicorn, the coder-journalist, the hack squared. I do believe that journalists must become conversant in technologies, sufficiently to enable them to (a) know what’s possible, (b) specify what they want, and (c) work with the experts who can create that.

in medias res: bridging the “time sap” gap, DIY politics, Google thinks you’re stupid, and more

  • When researchers started using the term “digital divide” in the 1990s, they were referring to unequal access to the Internet and other ICTs. Over time, the emphasis shifted from unequal access to disparities in technological competency across socioeconomic sectors. The newest manifestation of the digital divide, according to a New York Times article, is whether time on the Internet is spent productively or simply wasted:

As access to devices has spread, children in poorer families are spending considerably more time than children from more well-off families using their television and gadgets to watch shows and videos, play games and connect on social networking sites, studies show.

The new divide is such a cause of concern for the Federal Communications Commission that it is considering a proposal to spend $200 million to create a digital literacy corps. This group of hundreds, even thousands, of trainers would fan out to schools and libraries to teach productive uses of computers for parents, students and job seekers.

A study published in 2010 by the Kaiser Family Foundation found that children and teenagers whose parents do not have a college degree spent 90 minutes more per day exposed to media than children from higher socioeconomic families. In 1999, the difference was just 16 minutes.

  • In an op-ed for the LA Times, Neal Gabler writes that Obama’s legacy may be disillusionment with partisan politics and a shift toward do-it-yourself democracy:

Disillusionment with partisan politics is certainly nothing new. Obama’s fall from grace, however, may look like a bigger belly flop because his young supporters saw him standing so much higher than typical politicians. Yet by dashing their hopes, Obama may actually have accomplished something so remarkable that it could turn out to be his legacy: He has redirected young people’s energies away from conventional electoral politics and into a different, grass-roots kind of activism. Call it DIY politics.

We got a taste of DIY politics last fall with the Occupy Wall Street sit-ins, which were a reaction to government inaction on financial abuses, and we got another taste when the 99% Spring campaign mobilized tens of thousands against economic inequality. OWS and its tangential offshoots may seem political, but it is important to note that OWS emphatically isn’t politics as usual. It isn’t even a traditional movement.

  • In a piece on The Daily Beast, Andrew Blum, author of a new net-centric book titled Tubes: A Journey to the Center of the Internet, details the condescension and furtiveness he encountered while researching Google for his book:

Walking past a large data center building, painted yellow like a penitentiary, I asked what went on inside. Did this building contain the computers that crawl through the Web for the search index? Did it process search queries? Did it store email? “You mean what The Dalles does?” my guide responded. “That’s not something that we probably discuss. But I’m sure that data is available internally.” (I bet.) It was a scripted non-answer, however awkwardly expressed. And it might have been excusable, if the contrast weren’t so stark with the dozens of other pieces of the Internet that I visited. Google was the outlier—not only for being the most secretive but the most disingenuous about that secrecy.

After my tour of Google’s parking lot, I joined a hand-picked group of Googlers for lunch in their cafeteria overlooking the Columbia River. The conversation consisted of a PR handler prompting each of them to say a few words about how much they liked living in The Dalles and working at Google. (It was some consolation that they were treated like children, too.) I considered expressing my frustration at the kabuki going on, but I decided it wasn’t their choice. It was bigger than them. Eventually, emboldened by my peanut-butter cups, I said only that I was disappointed not to have the opportunity to go inside a data center and learn more. My PR handler’s response was immediate: “Senators and governors have been disappointed too!”

When news reports focus on individuals and their stories, rather than simply facts or policy, readers experience greater feelings of compassion, said Penn State Distinguished Professor Mary Beth Oliver, co-director of the Media Effects Research Laboratory and a member of the Department of Film-Video and Media Studies. This compassion also extends to feelings about social groups in general, including groups that are often stigmatized.

“Issues such as health care, poverty and discrimination all should elicit compassion,” Oliver said. “But presenting these issues as personalized stories more effectively evokes emotions that lead to greater caring, willingness to help and interest in obtaining more information.”

The emphasis on “personalized stories” reminds me of Zillmann’s exemplification theory, though the article makes no mention of exemplification.

The problem with living through a revolution is that you’ve no idea how things will turn out. So it is with the revolutionary transformation of our communications environment driven by the internet and mobile phone technology. Strangely, our problem is not that we are short of data about what’s going on; on the contrary we are awash with the stuff. This is what led Manuel Castells, the great scholar of cyberspace, to describe our current mental state as one of “informed bewilderment”: we have lots of information, but not much of a clue about what it means.

If, however, you’re concerned about things such as freedom, control and innovation, then the prospect of a world in which most people access the internet via smartphones and other cloud devices is a troubling one. Why? Because smartphones (and tablets) are tightly controlled, “tethered” appliances. You may think that you own your shiny new iPhone or iPad, for example. But in fact an invisible chain stretches from it all the way back to Apple’s corporate HQ in California. Nothing, but nothing, goes on your iDevice that hasn’t been approved by Apple.

In Medias Res: Chomsky Occupied, lolcats invade aca-meme-ia, the intention economy and more…

  • Microsoft is opening a research lab in New York City staffed by A-list sociologists, computational scientists, and network theorists, among others.

The NYC lab recruits bring mathematical and computational tools that could work magic with the social media research already underway at Microsoft Research, led by folks like Gen-fluxer danah boyd. “I would say that the highly simplified version of what happens is that data scientists do patterns and ethnographers tell stories,” boyd tells Fast Company. While Microsoft Research New England has strengths in qualitative social science, empirical economics, machine learning, and mathematics, “We’ve long noted the need for data science types who can bridge between us,” boyd explained in a blog post announcing the NYC lab.

So the world is now indeed splitting into a plutonomy and a precariat — in the imagery of the Occupy movement, the 1 percent and the 99 percent. Not literal numbers, but the right picture. Now, the plutonomy is where the action is and it could continue like this.

If it does, the historic reversal that began in the 1970s could become irreversible. That’s where we’re heading. And the Occupy movement is the first real, major, popular reaction that could avert this. But it’s going to be necessary to face the fact that it’s a long, hard struggle. You don’t win victories tomorrow. You have to form the structures that will be sustained, that will go on through hard times and can win major victories. And there are a lot of things that can be done.

  • An article at The Atlantic poses the question: Are LOLCats making us smart? The article quotes Kate Miltner, who wrote her dissertation on LOLCat memes:

According to Miltner, “When it came to LOLCats, sharing and creating were often different means to the same end: making meaningful connections with others.” At their core LOLCats weren’t about those funny captions, the weird grammar, or the cute kitties, but people employed those qualities in service of that primary goal of human connection.

A newer outgrowth of this idea is that information is so omnipresent, and consumers face so much of it, that businesses now operate in a completely different economic model: fighting for people’s attention. This attention economy has new rules based on how much time people are willing to spend paying attention to a given piece of information and, advertisers hope, to the advertisements that may surround it. New tools are emerging to analyze not just what is talked about but also sentiment, audience demographics, and how quickly it spreads.

To push efficiency, the better way would be to craft the message more accurately for specific people, not just a demographic: to me personally, not just to ‘people who live in that part of the city’. How would that be possible? It starts with trying to understand the intention of what people want, rather than just grabbing their attention as they walk away. If we knew, or better yet, if each consumer told us, what they wanted, and we could craft the message for each person as well as target exactly who would be interested, then the efficiency of that message suddenly shoots way up. It hinges on that dialogue with the consumer.

Scott Merrill at TechCrunch also covered Searls’ book:

Another substantial topic of the book is just how incorrect most of the information collected about us actually is. And still this factually wrong data is used to select which advertisements are presented to you, in the hope that you’ll click through. Aside from how intrusive advertising is, is it any surprise that click-through rates are so low when the data used to target ads to viewers is so wildly off-base?

Searls also advocates strongly for Vendor Relationship Management (VRM) solutions to give consumers the same kind of tracking and information collection about vendors that vendors use against us. The point of VRM is not adversarial, according to Searls. Instead, it restores balance to the overall market and seeks to actively reward those companies that pay attention to individual intentions.

In medias res: end-of-the-semester reading list

Due to end-of-the-semester activities, posting has been slow the last couple of weeks. But my exams are finished and I’ve submitted grades, so here’s a celebratory news roundup:

In an interview published Sunday, Google’s co-founder cited a wide range of attacks on “the open internet,” including government censorship and interception of data, overzealous attempts to protect intellectual property, and new communication portals that use web technologies and the internet, but under restrictive corporate control.

There are “very powerful forces that have lined up against the open internet on all sides and around the world,” says Brin. “I thought there was no way to put the genie back in the bottle, but now it seems in certain areas the genie has been put back in the bottle.”

The post-social world is an “attention economy.” If you don’t have engagement, you don’t have attention, and if you don’t have attention – well, you don’t have anything, really.

In the 1970s, the scholar Herbert Simon argued that “in an information-rich world, the wealth of information means a dearth of something else: a scarcity of whatever it is that information consumes. What information consumes is rather obvious: it consumes the attention of its recipients.”

His arguments give rise both to the notion of “information overload” and to the “attention economy”. In the attention economy, people’s willingness to distribute their attention among various information stimuli creates value for those stimuli. Indeed, the economic importance of advertising is predicated on the notion that getting people to pay attention to something has value.

Which three trends are likely to have the most impact on international relations over the next decade, and could help us anticipate global political crises? At the top of my news feed are items about who is in jail and why, rigged elections, and social media.

School shootings and domestic terrorism have proliferated on a global level. In recent months there have been school shootings in Finland, Germany, Greece, and other countries as well as the United States. Although there may be stylistic differences, in all cases young men act out their rage through the use of guns and violence to create media spectacles and become celebrities-of-the-moment.

Class dismissed, have a great summer!