- Almetria Vaba of PBS LearningMedia has posted a collection of resources for exploring media literacy through the legacy of Dr. Martin Luther King Jr.:
Examine the life and legacy of Dr. Martin Luther King Jr. and the Civil Rights Movement with hundreds of PBS LearningMedia resources. Use this sampling from the extensive offering to explore media literacy, from historical documentaries to media coverage of social movements.
- Sonia Paul at PBS MediaShift reported on a recent Pew Research study on social media, stress, and the “cost of caring”:
Among the survey’s major findings is that women are much more likely than men to feel stressed after becoming aware of stressful events in the lives of others in their networks.
“Stress is kind of contagious in that way,” said Keith Hampton, an associate professor at Rutgers University and the chief author of the report. “There’s a circle of sharing and caring and stress.”
- Lily Hay Newman reported on the survey for Slate:
In a survey of 1,801 adults, Pew found that frequent engagement with digital services wasn’t directly correlated to increased stress. Women who used social media heavily even recorded lower stress. The survey relied on the Perceived Stress Scale, a widely used stress-measurement tool developed in the early 1980s.
“We began to work fully expecting that the conventional wisdom was right, that these technologies add to stress,” said Lee Rainie, the director of Internet, science, and technology research at Pew. “So it was a real shock when [we] first looked at the data and … there was no association between technology use, especially heavy technology use, and stress.”
- LiveScience writer Elizabeth Palermo looked at the gendered differences found by the study:
The higher incidence of stress among the subset of technology users who are aware of stressful events in the lives of others is something that Hampton and his colleagues call “the cost of caring.”
“You can use these technologies and, as a woman, it’s probably going to be beneficial for your level of stress. But every now and then, bad things are going to happen to people you know, and there’s going to be a cost for that,” Hampton said.
- Nicholas Carr recently penned an editorial for The Guardian considering whether we are becoming too reliant on computers:
The real danger we face from computer automation is dependency. Our inclination to assume that computers provide a sufficient substitute for our own intelligence has made us all too eager to hand important work over to software and accept a subservient role for ourselves. In designing automated systems, engineers and programmers also tend to put the interests of technology ahead of the interests of people. They transfer as much work as possible to the software, leaving us humans with passive and routine tasks, such as entering data and monitoring readouts. Recent studies of the effects of automation on work reveal how easily even very skilled people can develop a deadening reliance on computers. Trusting the software to handle any challenges that may arise, the workers fall victim to a phenomenon called “automation complacency”.
- David Whelan at Vice interviewed Carr on the issue of technology dependency:
Should we be scared of the future?
I think we should be worried about the future. We are putting ourselves passively into the hands of those who design the systems. We need to think critically about that, even as we maintain our enthusiasm for the great inventions that are happening. I’m not a Luddite. I’m not saying we should trash our laptops and run off to the woods.
We’re basically living out Freud’s death drive, trying our best to turn ourselves into inorganic lumps.
Even before Freud, Marx made the point that the underlying desire of technology seemed to be to create animate technology and inanimate humans. If you look at the original radios, they were transmission as well as reception devices, but before long most people just stopped transmitting and started listening.
- Writing at Figure/Ground, John Dowd argues that being there still matters for teaching and learning in the digital age:
From an educational perspective, what we must understand is the relationship between information and meaning. Meaning is not an inevitable outcome of access to information but rather, emerges slowly when one has cultivated his or her abilities to incorporate that information in purposeful and ethical ways. Very often this process requires a slowdown rather than a speedup, the latter of which being a primary bias of many digital technologies. The most powerful educational experiences stem from the relationships formed between teacher and student, peer and peer. A smart classroom isn’t necessarily one that includes the latest technologies, but one that facilitates greater interaction among teachers and students, and responsibility for the environment within which one learns. A smart classroom is thus spatially, not primarily technologically, smart. While the two are certainly not mutually exclusive (and much has been written on both), we do ourselves a disservice when privileging the latter over the former.
- Dowd’s argument here is similar to Carr’s thoughts on MOOCs:
In education, computers are also falling short of expectations. Just a couple of years ago, everyone thought that massive open online courses – Moocs – would revolutionise universities. Classrooms and teachers seemed horribly outdated when compared to the precision and efficiency of computerised lessons. And yet Moocs have largely been a flop. We seem to have underestimated the intangible benefits of bringing students together with a real teacher in a real place. Inspiration and learning don’t flow so well through fibre-optic cables.
- MediaPost editor Steve Smith writes about his relationship with his iPhone, calling it life’s new remote:
The idea that the cell phone is an extension of the self is about as old as the device itself. We all recall the hackneyed “pass your phone to the person next to you” thought experiment at trade shows four or five years ago. It was designed to make the point of how “personally” we take these devices.
And now the extraordinary and unprecedented intimacy of these media devices is a part of legal precedent. The recent Supreme Court ruling limiting searches of cell phone contents grounded the unanimous opinion on an extraordinary observation. Chief Justice John Roberts described these devices as being “such a pervasive and insistent part of daily life that the proverbial visitor from Mars might conclude they were an important feature of human anatomy.”
We are only beginning to understand the extent to which these devices are blending the functionality of media with that of real world tools. And it is in line with one of Marshall McLuhan’s core observations in his “Understanding Media” book decades ago.
- Tomas Chamorro-Premuzic contributed a piece to The Guardian referencing Carr to consider how technology has downgraded attention:
As early as 1971 Herbert Simon observed that “what information consumes is rather obvious: it consumes the attention of its recipients. Hence a wealth of information creates a poverty of attention, and a need to allocate that attention efficiently among the overabundance of information sources that might consume it”. Thus instead of reaping the benefits of the digital revolution we are intellectually deprived by our inability to filter out sensory junk in order to translate information into knowledge. As a result, we are collectively wiser, in that we can retrieve all the wisdom of the world in a few minutes, but individually more ignorant, because we lack the time, self-control, or curiosity to do it.
There are also psychological consequences of the distraction economy. Although it is too soon to observe any significant effects from technology on our brains, it is plausible to imagine that long-term effects will occur. As Nicholas Carr noted in The Shallows: What the internet is doing to our brains, repeated exposure to online media demands a cognitive change from deeper intellectual processing, such as focused and critical thinking, to fast autopilot processes, such as skimming and scanning, shifting neural activity from the hippocampus (the area of the brain involved in deep thinking) to the prefrontal cortex (the part of the brain engaged in rapid, subconscious transactions). In other words, we are trading speed for accuracy and prioritise impulsive decision-making over deliberate judgment. In the words of Carr: “The internet is an interruption system. It seizes our attention only to scramble it”.
- James Vincent at The Verge covered a recent study that links nighttime screen use with less REM sleep:
The research, carried out by Harvard Medical School and published in the journal Proceedings of the National Academy of Sciences, studied the sleeping patterns of 12 volunteers over a two-week period. Each individual read a book before their strict 10PM bedtime — spending five days with an iPad and five days with a paper book. The scientists found that when reading on a lit screen, volunteers took an average of 10 minutes longer to fall asleep and received 10 minutes less REM sleep. Regular blood samples showed they also had lower levels of the sleep hormone melatonin, consistent with a circadian cycle delayed by one and a half hours.
- At AdBusters, Douglas Haddow writes that sleep is the enemy of capital:
Ever since the frequent cocaine user and hater of sleep Thomas Edison flicked on the first commercially-viable electric lightbulb, a process has taken hold through which the darkness of sleep time has been systematically deconstructed and illuminated.
Most of us now live in insomniac cities with starless skies, full of twinkling neon signage and flickering gadgets that beg us to stay awake longer and longer. But for all this technological innovation, we still must submit to our diurnal rhythm if we want to stay alive.
And even though sleep may “frustrate and confound strategies to exploit and reshape it,” as Crary says, it, like anything, remains a target of exploitation and reshaping – and in some cases, all-out elimination.
- In an interview with TruthOut to discuss his latest book, Robert McChesney addresses telecommunications monopolies and net neutrality, and advocates radical solutions to systemic problems:
What is striking about this corporate monopolization of the internet is that all the wealth and power has gone to a small number of absolutely enormous firms. As we enter 2015, 13 of the 33 most valuable corporations in the United States are internet firms, and nearly all of them enjoy monopolistic market power as economists have traditionally used the term. If you continue to scan down the list there are precious few internet firms to be found. There is not much of a middle class or even an upper-middle class of internet corporations to be found.
This poses a fundamental problem for democracy, though it is one that mainstream commentators and scholars appear reluctant to acknowledge: If economic power is concentrated in a few powerful hands you have the political economy for feudalism, or authoritarianism, not democracy. Concentrated economic power invariably overwhelms the political equality democracy requires, leading to routinized corruption and an end of the rule of law. That is where we are today in the United States.
- In light of recent terrorist attacks and renewed hysteria about fundamentalist ideologies, I revisited Mark Manson’s essay probing why there seems to be more fundamentalism in the world today:
The short answer is technology. Yes, Facebook really did ruin everything. The explosion in communication technologies over the past decades has re-oriented society and put more psychological strain on us all to find our identities and meaning. For some people, the way to ease this strain is to actually reject complexity and ambiguity for absolutist beliefs and traditional ideals.
Philosopher Charles Taylor wrote that it would be just as difficult to not believe in God in 1500 as it is to believe in God in the year 2000. Obviously, most of humanity believes in God today, but it’s certainly become a much more complicated endeavor. With the emergence of modern science, evolution, liberal democracy, and worldwide 24-hour news coverage of corruption, atrocities, war and religious hypocrisy, today a person of faith has their beliefs challenged more in a week than a person a few generations ago would have in half a lifetime.
- In a post at the Jacobin blog, Anthony Galluzzo considers how the mainstream media’s “fucking hipster” show mocks hipsters in the service of capital:
[Marxist geographer Neil] Smith offers a dry, but emphatically structural account of this process, which he first theorized in the late eighties with Soho and the Lower East Side in mind. Gentrification has since become central to neoliberal urbanization generally, and New York City in particular, under the developer-driven Bloomberg administration.
But why bother with “dry” and “structural” when you can tune in to the “fucking hipster” show?
Unlike Smith’s rigorous Marxian analysis, most popular accounts, from the spurious creative-class mystifications of Richard Florida to standard-issue conservative populist diatribes, forget the larger forces and primary movers in this process, which is instead reduced, metonymically, to the catchall figure of the hipster.
On topics ranging from the capitalist dynamics of gentrification to the casualization of employment among ostensibly middle class Millennials, the “fucking hipster” show beats staid structural analysis every time — even for many members of the self-identified Left.
We should retire “hipster” as a term without referent or political salience. Its zombie-like persistence in anti-hipster discourse must be recognized for what it is: an urbane, and socially acceptable, form of ideologically inflected shaming on the part of American elites who must delegitimize those segments of a largely white, college educated population who didn’t do the “acceptable thing.”
The anti-hipster censure here includes a healthy dose of typically American anti-intellectualism, decked out in liberal bunting, subtle homophobia, and recognizably manipulative appeals to white, middle class resentment, now aimed at the lazy hipster, who either lives on his trust fund or, more perniciously, abuses public assistance, proving how racist templates are multi-use tools.
Our power elites’ rhetorical police action becomes increasingly necessary as large swaths of the people lumped under the hipster taxon slip into the ranks of the long-term un- and underemployed. Once innocuous alternative lifestyles could potentially metamorphosize into something else altogether. Better to frame “alternative lifestyle” in terms of avant-garde trend setting without remainder, providing suitably rarefied consumption options for Bloomberg’s new bourgeoisie, as they buy locally sourced creativity on Bedford Ave.
- Anti-homeless features in urban design became a trending media topic earlier this month after pictures of anti-homeless studs in London were shared on social media. The Mirror reports on the background and the eventual removal of the spikes:
Metal spikes designed to stop homeless people sleeping in the doorway of a London apartment block have been removed, after almost 130,000 people signed a petition calling for them to be taken out.
Pictures of the metal studs outside flats in Southwark Bridge Road were widely shared online last weekend, sparking outrage on social media.
Many criticised the spikes as inhumane, and compared them to those used to stop pigeons landing on buildings.
- An Atlantic article by Robert Rosenberger looks at how cities use design to drive homeless people away:
It has been encouraging to see the outrage over the London spikes. But the spikes that caused the uproar are by no means the only form of homeless-deterrent technology; they are simply the most conspicuous. Will public concern over the spikes extend to other less obvious instances of anti-homeless design? Perhaps the first step lies in recognizing the political character of the devices all around us.
An example of an everyday technology that’s used to forbid certain activities is “skateboard deterrents,” that is, those little studs added to handrails and ledges. These devices, sometimes also called “skatestoppers” or “pig ears,” prevent skateboarders from performing sliding — or “grinding” — tricks across horizontal edges. A small skateboard deterrence industry has developed, with vendors with names like “stopagrind.com” and “grindtoahalt.com.”
An example of a pervasive homeless deterrence technology is benches designed to discourage sleeping. These include benches with vertical slats between each seat, individual bucket seats, large armrests between seats, and wall railings which enable leaning but not sitting or lying, among many other designs. There are even benches made to be slightly uncomfortable in order to dissuade people from sitting too long. Sadly, such designs are particularly common in subways, bus stops, and parks that present the homeless with the prospect of a safely public place to sleep.
The London spikes provide an opportunity to put a finger on our own intuitions about issues of homelessness and the design of open space. Ask yourself if you were appalled by the idea of the anti-homeless spikes. If so, then by implication you should have the same problems with other less obvious homeless deterrence designs like the sleep-prevention benches and the anti-loitering policies that target homeless people.
- In the Guardian, Ben Quinn writes that anti-homeless spikes are part of a wider phenomenon of “hostile architecture”:
In addition to anti-skateboard devices, with names such as “pig’s ears” and “skate stoppers”, ground-level window ledges are increasingly studded to prevent sitting, slanting seats at bus stops deter loitering and public benches are divided up with armrests to prevent lying down.
To that list, add jagged, uncomfortable paving areas, CCTV cameras with speakers and “anti-teenager” sound deterrents, such as the playing of classical music at stations and so-called Mosquito devices, which emit irritatingly high-pitched sounds that only teenagers can hear.
The architectural historian Iain Borden traces the emergence of hostile architecture to 1990s urban design and public-space management, which, he said, “suggested we are only public citizens to the degree that we are either working or consuming goods directly.
“So it’s OK, for example, to sit around as long as you are in a cafe or in a designated place where certain restful activities such as drinking a frappuccino should take place but not activities like busking, protesting or skateboarding. It’s what some call the ‘mallification’ of public space, where everything becomes like a shopping mall.”
- Following last month’s post on David Graeber’s views on “bullshit jobs,” this Salon interview with Graeber discusses the failed forecast of universal leisure time:
Right after my original bullshit jobs piece came out, I used to think that if I wanted, I could start a whole career in job counseling – because so many people were writing to me saying “I realize my job is pointless, but how can I support a family doing something that’s actually worthwhile?” A lot of people who worked the information desk at Zuccotti Park, and other occupations, told me the same thing: young Wall Street types would come up to them and say “I mean, I know you’re right, we’re not doing the world any good doing what we’re doing. But I don’t know how to live on less than a six figure income. I’d have to learn everything over. Could you teach me?”
But I don’t think we can solve the problem by mass individual defection. Or some kind of spiritual awakening. That’s what a lot of people tried in the ‘60s and the result was a savage counter-offensive which made the situation even worse. I think we need to attack the core of the problem, which is that we have an economic system that, by its very nature, will always reward people who make other people’s lives worse and punish those who make them better. I’m thinking of a labor movement, but one very different than the kind we’ve already seen. A labor movement that manages to finally ditch all traces of the ideology that says that work is a value in itself, but rather redefines labor as caring for other people.
- In an article for Al Jazeera, Sarah Kendzior surveys the politics of gentrification and the perils of hipster economics:
Proponents of gentrification will vouch for its benevolence by noting it “cleaned up the neighbourhood”. This is often code for a literal white-washing. The problems that existed in the neighbourhood – poverty, lack of opportunity, struggling populations denied city services – did not go away. They were simply priced out to a new location.
That new location is often an impoverished suburb, which lacks the glamour to make it the object of future renewal efforts. There is no history to attract preservationists because there is nothing in poor suburbs viewed as worth preserving, including the futures of the people forced to live in them. This is blight without beauty, ruin without romance: payday loan stores, dollar stores, unassuming homes and unpaid bills. In the suburbs, poverty looks banal and is overlooked.
In cities, gentrifiers have the political clout – and accompanying racial privilege – to reallocate resources and repair infrastructure. The neighbourhood is “cleaned up” through the removal of its residents. Gentrifiers can then bask in “urban life” – the storied history, the selective nostalgia, the carefully sprinkled grit – while avoiding responsibility to those they displaced.
Hipsters want rubble with guarantee of renewal. They want to move into a memory they have already made.
- At Mute, Dominic Pettman writes about the rise of massive open online courses (MOOCs) in higher education, and how commodification affects the value of learning:
In the pedagogic trenches, MOOCs are considered a symptom of wider economic patterns which effectively vacuum resources up into the financial stratosphere, leaving those doing the actual work with many more responsibilities, and far less compensation. Basic questions about the sustainability of this model remain unanswered, but it is clear that there is little room for enfranchised, full-time, fully-compensated faculty. Instead, we find an army of adjuncts servicing thousands of students; a situation which brings to mind scenes from Metropolis rather than Dead Poets Society.
For companies pushing MOOCs, education is no different from entertainment: it is simply a question of delivering ‘content.’ But learning to think exclusively via modem is like learning to dance by watching YouTube videos. You may get a sense of it, but no-one is there to point out mistakes, deepen your understanding, contextualise the gestures, shake up your default perspective, and facilitate the process. The role of the professor or instructor is not simply the shepherd for the transmission of information from point A to point B, but the co-forging of new types of knowledge, and critically testing these for various versions of soundness and feasibility. Wisdom may be eternal, but knowledge – both practical and theoretical – evolves over time, and especially exponentially in the last century, with all its accelerated technologies. Knowledge is always mediated, so we must consciously take the tools of mediation into account. Hence the need for a sensitive and responsive guide: someone students can bounce new notions off, rather than simply absorb information from. Without this element, distance learning all too often becomes distanced learning. Just as a class taken remotely usually leads to a sea of remote students.
Marshall McLuhan was half-right when he insisted that the electronic age is ushering in a post-literate society. But no matter how we like to talk of new audio-visual forms of literacy, there is still the ‘typographic man’ pulling the strings, encouraging us to express ourselves alphabetically. Indeed, the electronic and the literate are not mutually exclusive, much as people like to pit them against each other.
- Pettman also quotes Ian Bogost’s comments on distance learning:
The more we buy into the efficiency argument, the more we cede ground to the technolibertarians who believe that a fusion of business and technology will solve all ills. But then again, I think that’s what the proponents of MOOCs want anyway. The issue isn’t online education per se, it’s the logics and rationales that come along with certain implementations of it.