by John Durham Peters
The essay begins:
In Woody Allen’s romantic comedy Annie Hall (1977), the world’s most famous technological determinist had a brief cameo that in some circles is as well-known as the movie itself. Woody Allen, waiting with Diane Keaton in a slow-moving movie ticket line, pulls Marshall McLuhan from the woodwork to rebuke the blowhard in front of them, who is pontificating to his female companion about McLuhan’s ideas. McLuhan, as it happened, was not an easy actor to work with: even when playing a parody of himself, a role he had been practicing full-time for years, he couldn’t remember his lines, and when he could remember them, he couldn’t deliver them. In the final take (after more than fifteen tries), McLuhan tells the mansplainer, “I heard what you were saying. You, you know nothing of my work. You mean my whole fallacy is wrong. How you ever got to teach a course in anything is totally amazing.” In the film, the ability to call down ex cathedra authorities at will to silence annoying know-it-alls is treated as the ultimate in wish fulfillment as Allen says to the camera, “Boy, if life were only like this!” Rather than a knockout punch, however, McLuhan tells the man off with something that sounds like a Zen koan, an obscure private joke, or a Groucho Marx non sequitur. There is more going on here than a simple triumph over someone else’s intellectual error. Isn’t a fallacy always self-evidently wrong?
That a fallacy might not necessarily be wrong is the question I take up in this essay. Whatever technological determinism is, it is one of a family of pejoratives by which academics reprove their fellows for single-minded devotion (or monomaniacal fanaticism) to their pet cause. At least since “sophist” was launched as a slur in ancient Greece, it has been a regular sport to contrive doctrines that nobody believes and attribute them to one’s enemies. Terms ending with -ism serve this purpose particularly well. As Robert Proctor notes in an amusing and amply documented survey of academic nomenclature, “‘Bias’ and ‘distortion’ are perennial terms of derision from the center, and the authors of such slants are often accused of having fallen into the grip of some blinding ‘-ism.’” Often these -isms, he continues, “are things no one will openly claim to support: terrorism, dogmatism, nihilism, and so on.” (Racism and sexism are even better examples.) Terms ending with -ism, such as economism, fetishism, formalism, physicalism, positivism, and scientism, often stand for “zealotry or imprudence in the realm of method,” with reductionism standing for the whole lot. Corresponding nouns ending with -ist designate those people accused of harboring such doctrines—reductionist, fetishist, formalist—though -ist is a tricky particle. Artist, economist, psychologist, and above all, scientist have positive valences; artistic is a term of praise, but scientistic suggests being in the grip of an ideology. (It might be bad to be a positivist, but Trotskyist is strongly preferred to Trotskyite; it would be a big job to fully describe the whimsical behavior of the -ism clan, tasked as it is with policing ultrafine differences.) Pathologies such as logocentrism, phallogocentrism, and heteronormativity are often diagnosed in people who do not realize they are carriers.
Technological determinism belongs to this family of conceptual maladies thought unknown to their hosts but discernible by a savvy observer. It is one of a long line and large lexicon of academic insults and prohibitions. As old as academic inquiry is the conviction of the blindness of one’s fellow inquirers. From listening to the ways scholars talk about each other, you would not think they were a collection of unusually brainy people but rather a tribe uniquely susceptible to folly and stupidity. The cataloging of fallacies has been motivated by a desire to regulate (or mock) the thinking of the learned as much as of the crowd. The academy has been fertile soil for satirists from the ancient comic playwrights to Erasmus and Rabelais, from Swift and the Encyclopédie to Nietzsche to the postwar campus novel. Whatever else Renaissance humanism was, it was a critique of scholarly vices, and Erasmus’s Praise of Folly is, among other things, a compendium of still relevant witticisms about erudite errors. There are as many fallacies as chess openings, and the names of both index exotic, often long-forgotten historical situations. We shouldn’t miss the element of satire and parody in such cartoonish names as the red herring, bandwagon, card-stacking, and cherry-picking fallacies. “The Texas sharpshooter fallacy” is drawing the target after you have taken the shots. The “Barnum effect” describes the mistake of taking a trivially general statement as uniquely significant (as in fortune cookies or astrology readings). The study of fallacies gives you a comic, sometimes absurdist glance at the varieties of cognitive tomfoolery.
One reason why academic life is the native soil for the detection of fallacies is the great profit that can be wrung from strategic blindness. Looking at the world’s complexity through a single variable—air, fire, water, God, ideas, money, sex, genes, media—can be immensely illuminating. (Granting agencies smile on new “paradigms.”) A key move in the repertoire of academic truth games is noise-reduction. John Stuart Mill once noted of “one-eyed men” that “almost all rich veins of original and striking speculation have been opened by systematic half-thinkers.” A less single-minded Marx or Freud would not have been Marx or Freud. Intellectuals can be richly rewarded for their cultivated contortions.
But one man’s insight is another man’s blindness. The one-eyed gambit invites the counterattack of showing what has gone unseen, especially as prophetic vision hardens into priestly formula. Nothing quite whets the academic appetite like the opportunity to prove what dullards one’s fellows are, and, for good and ill, there never seems to be any shortage of material. (We all know people who think they can score points during Q&A sessions by asking why their favorite topic was not “mentioned.” Someone should invent a fallacy to name that practice.) At some point every scholar has felt the itch to clear the ground of previous conceptions and ill-founded methods; this is partly what “literature review” sections are supposed to do. (The history of the study of logic is littered with the remains of other people’s attempted cleanup jobs.) Scholars love to upbraid each other for being trapped by the spell of some nefarious influence. How great the pleasure in showing the folly of someone’s –istic ways! The annals of academic lore are full of tales of definitive takedowns and legendary mic drops, and social media platforms such as Facebook provide only the most recent germ culture for the viral spread of delicious exposés of the ignorant (as often political as academic). This is one reason McLuhan’s Annie Hall cameo continues to have such resonance: it is the archetype of a decisive unmasking of another scholar’s fraudulence or ignorance.
But it is also a classic fallacy: the appeal to authority. Who says McLuhan is the best explicator of his own ideas? As he liked to quip: “My work is very difficult: I don’t pretend to understand it myself.” You actually get a better sense of what McLuhan wrote from the blowhard, however charmlessly presented, than from McLuhan. The disagreeable truth is that what the man is doing isn’t really that awful or that unusual: it is standard academic behavior in the classroom at least, if not the movie line. Laughing at someone teaching a course on “TV, media, and culture” is, for many of us, not to recognize ourselves in the mirror. The fact that so many academics love the Annie Hall put-down is one more piece of evidence showing our vulnerability to fallacious modes of persuasion. Why should we delight in the silencing of a scholar by the gnomic utterances of a made-for-TV magus? Since when is silencing an academic value? And by someone who doesn’t really make any sense?
Silencing is one thing that the charge of technological determinism, like many other so-called fallacies, does. Fallacies need to be understood within economies and ecologies of academic exchange. They are not simply logical missteps. To accuse another of a fallacy is a speech act, a communicative transaction. The real danger of technological determinism may be its labeling as a fallacy. The accusation, writes Geoffrey Winthrop-Young, “frequently contains a whiff of moral indignation. To label someone a technodeterminist is a bit like saying that he enjoys strangling cute puppies; the depraved wickedness of the action renders further discussion unnecessary.” The threat of technological determinism, according to Wolf Kittler, “goes around like a curse frightening students.” The charge can conjure up a kind of instant consensus about what right-minded people would obviously avoid. The charge of technological determinism partakes of a kind of “filter bubble” logic of unexamined agreement that it’s either machines or people. Jill Lepore recently put it with some ferocity: “It’s a pernicious fallacy. To believe that change is driven by technology, when technology is driven by humans, renders force and power invisible.”
There are undeniably many vices and exaggerations around the concept of technology. But my overarching concern here is not to block the road of inquiry. (No-go zones often have the richest soil.) In a moment when the meaning of technics is indisputably one of the most essential questions facing our species, do we really want to make it an intellectual misdemeanor to ask big questions about “technology” and its historical role, however ill-defined the category is? What kinds of inquiry snap shut if we let the specter of technological determinism intimidate us? The abuse does not ruin the use. The question is particularly pointed for my home field of media studies, whose task is to show that form, delivery, and control, as well as storage, transmission, and processing, all matter profoundly. If explanations attentive to the shaping role of technological mediation are ruled out, the raison d’être of the field is jeopardized. It is so easy to sit at our Intel-powered computers and type our latest critique of technological determinism into Microsoft Word files while Googling facts and checking quotes online. We are too busy batting away the gnats of scholarly scruples to notice that we have swallowed a camel.
This essay, a contribution to the special issue “Fallacies,” offers both a genealogy of the concept of technological determinism and a metacritique of the ways academic accusations of fallaciousness risk stopping difficult but essential kinds of inquiry. To call someone a technological determinist is to claim all the moral force on your side without answering the question of what we are to do with these devices that infest our lives.
JOHN DURHAM PETERS is María Rosa Menocal Professor of English and Film and Media Studies at Yale and author of The Marvelous Clouds: Toward a Philosophy of Elemental Media (2015).
Real-to-Reel: Social Indexicality, Sonic Materiality, and Literary Media Theory in Eduardo Costa’s Tape Works
by Tom McEnaney
The essay begins:
In 1968, Vogue magazine featured an unusual new accessory. Ear (1966), a 24-karat gold anatomical replica that entirely covered model Marisa Berenson’s own ear, was one of a number of fitted extensions—there was also a finger, a toe, and strands of gold hair—that Argentine-born artist Eduardo Costa included in his Fashion Fiction 1. Photographed by Richard Avedon on one of Vogue’s most famous models, Costa’s jewelry—part sculpture, part ornamental prosthetic—attempted to parody the fashion industry even as it was absorbed into its pages. Playful and seductive, Ear wavered on the boundary—quickly eroding in 1968—between high-end fashion and vanguard art. At their most critical, Ear and other Fashion Fictions by Costa literalized the familiar reification of commodity culture: turning human body parts into objects, the works winked at fashion’s claim to be an extension of yourself. In repurposing the language of fashion, they also made sense in the Vogue of the late 1960s alongside the work of Andy Warhol, Roy Lichtenstein, Claes Oldenburg, and other artists. For, like these contemporaries in pop art or works from the Latin American neo-baroque, Costa’s ornaments reveled in the surface rather than condemning the superficial. This fascination with surfaces found an ideal corollary in Avedon’s photography, which celebrated the foreground. With Ear, Avedon’s portrait of Berenson became an almost mythic testament to the “statuesque” model, whose image recalls both a passing victim of Midas’s touch and a Galatea on the verge of breaking into the auditory world.
If Ear stopped there, however, we could stack Costa’s Fashion Fictions alongside Oldenburg’s everyday objects or Warhol’s Brillo Boxes—all three artists shared work at the Fashion Show Poetry Event held at the Center for Inter-American Relations in New York in January of 1969. But Ear distinguishes itself from pop art standards not so much through its send-up of commodity culture as through its emphasis on the auditory image. This sculpture, or ornament, or prosthetic shows what it doesn’t tell: sound is everywhere implicit but nowhere physically present in the work. Asking its viewers to look at listening, Ear transforms the apparently ephemeral world of sound into a physical object.
This objectification of sound, whose effect on the wearer, it’s worth remembering, would be to mute or dull audition, ties in to the revolution in materializing sound in the 1960s. Like our own moment’s explosion of new technologies and formats for producing and consuming sound, postwar innovations in audio engineering, largely linked to the emergence of newly popular recording materials such as magnetic tape, renewed older concerns about fidelity and the realism of reproduced sound. Yet, notably different from most current criticism of digital sound’s apparent loss of fidelity, the 1960s technologies helped produce the cult of high fidelity, renewing nineteenth-century discourses of sonic fidelity and the belief that sound reproduction could become indistinguishable from the recorded source.
As I will explain in greater detail in what follows, Costa’s work at this time went beyond sculpture and concept to draw from new sound recording technologies’ ability to register and (re)produce sonic phenomena, and to bind these transformations to language and literature. In terms familiar to media studies, just as photography or film’s chemical imprint of the sun’s rays onto photographic negatives indexed physical traces of light, high fidelity seemed to expand what Friedrich Kittler would celebrate as the gramophone’s ability to inscribe the material “real” of sonic vibrations onto cylinders or shellac discs. Yet, while Kittler declared that electrical sound recording tolled the death knell of literature, Costa’s tape recording work in the late 1960s fuses the material index of media studies with what linguistic anthropologist Michael Silverstein calls the “non-referential social indexicality” available in language. Such social indexicality exists, for example, in the sonic attributes of a voice that can index a speaker’s age, nationality, sex, and so on. Against what has often been understood as the impasse between literature and media in the wake of Kittler, Costa brings together these two sides of the index to create a literary media theory and practice based in sound recording.
This article develops a linguistic media theory that brings together Peircean materialist indexicality from Barthes, Bazin, Doane, Krauss, and others with linguistic anthropologist Michael Silverstein’s nonreferential (social) indexicality. Following Argentine sound artist Eduardo Costa’s practice with tape recording, the article challenges critical theory to account for the sonic meaning at play in pragmatic (nonsemantic) communication related to gender, race, and diasporic community. More than a mere supplement or limit, material sonic media expand aesthetic representation, and media archaeology opens new possibilities to intervene in language politics.
TOM McENANEY is Assistant Professor of Comparative Literature at Cornell University. He is the author of several articles and the forthcoming book Acoustic Properties: Radio, Narrative, and the New Neighborhood of the Americas (Flashpoints Series, Northwestern University Press, 2017).
by Weihong Bao
Wednesday, Oct 19, 2016 | 12:00 pm to 1:00 pm
Geballe Room, 220 Stephens Hall, UC Berkeley
Mapping the changing identity of cinema in China in relation to Republican-era print media, theatrical performance, radio broadcasting, television, and architecture in Fiery Cinema, Weihong Bao constructs an archaeology of Chinese media culture. She grounds the question of spectatorial affect and media technology in China’s experience of mechanized warfare, colonial modernity, and the shaping of the public into consumers, national citizens, and a revolutionary collective subject.
A major contribution to the theory and history of media, Fiery Cinema rethinks the nexus of affect and medium to offer key insights into the relationship of cinema to the public sphere and the making of the masses.
Weihong Bao is associate professor of East Asian Languages and Cultures and Film and Media at UC Berkeley. Her short essay “From Duration to Temporization: Rethinking Time and Space for Durational Art” will appear in the fall issue of Representations (the special issue “Time Zones: Durational Art and Its Contexts”), available next month.
After an introduction by Andrew Jones (East Asian Languages and Cultures, UC Berkeley, and Representations editorial board member), Bao will speak briefly about her work and then open the floor for discussion.