Medium’s Medium

In recognition of the impact of Covid-19 on campus instruction and the rise of unplanned distance learning, UC Press is pleased to make Representations and all of its online journals content free to all through June 2020.

The Medium Concept

by Anna Shechtman

In the second half of the twentieth century, in the very decades when the concept of “media” entered the vernacular, the “medium concept” began to shape American art criticism and curation. This was no coincidence: “mediums” emerged as a category for the organization and appreciation of art as the dialectical counterpart to media, and in response to the cultural imperialism of its mass-produced forms. As art became increasingly public, mediums became the public face of art.

The essay begins:

In 2006 New York’s Museum of Modern Art (MoMA) established the Department of Media, its first new curatorial department since 1971. A press release heralding the new department described MoMA as a “pioneer in the exhibition, acquisition and conservation of media art since the late 1960s.” It clarified that works of “media art” were “time- and sound-based,” as opposed to the mostly static and strictly visual works in MoMA’s departments of Architecture and Design, Drawings, Painting and Sculpture, Photography, and Prints and Illustrated Books. But there was, of course, another MoMA department that housed “time- and sound-based” art—the Department of Film—which, since 1935, had been collecting and conserving celluloid reels and digital projections from Hollywood and international film productions. Unlike those audiovisual works, however, “media art,” the release said, was “made for and presented in a gallery setting.” Works that met these criteria included performance art, video art, and installation art—all three of which, since their emergence in the late-1960s, have variously and often arbitrarily been called intermedia, multimedia, or mixed media art. Why, then, after forty years of collecting such work, did the museum finally institute a department of it in media’s name? The answer is partly functional: the conservation and display of installation and video art is technically different from that needed for oil-on-canvas or 16 mm celluloid works; a new department could better accommodate these differences. But it’s also epistemological and ideological: when media became a department at MoMA, it also became a medium—captured by the formalist logic of American art criticism after World War II and assimilated, however awkwardly, into the medium-specific organizational structure of the country’s premier modern art museum.

On the day of the new department’s founding, MoMA’s director Glenn D. Lowry confirmed that the newly appointed chief curator of media had “demonstrated a keen understanding of the dynamic potential of this medium.” This medium, that is, media. With this statement, Lowry not only instituted a linguistic paradox, he also dramatized—to the point of satire—a decades-old antinomy within the discipline of art history between mediums (specialized subcategories of “art,” including but not limited to those represented by MoMA’s departments) and media (an imbroglio of cultural production of little aesthetic value or political virtue). Since the 1960s, art critics and historians have been using the un-Latinate plural “mediums” to distinguish, and rhetorically elevate, works of art from the cultural chaos of “mass media,” “the media,” and, for that matter, the growing postwar trend of inter-, multi-, and mixed media works of art. The distinction rests on a precarious but potent conceit: that the history of modern art is autonomous, not only from the noxious commercialism of the culture industries, but also from the communication technologies that have supported them—especially, as the century progressed, from television, video, and digital images.

So mediums, not media. Rosalind Krauss was clearest about this mostly tacit disciplinary tic when she wrote, in a footnote to her 1999 manifesto Voyage on the North Sea: Art in the Age of the Post-Medium Condition: “Throughout this text I will use mediums as the plural of medium in order to avoid a confusion with media, which I am reserving for the technologies of communication indicated by that latter term.” Her distinction, however, confuses as it clarifies: after all, the “different mediums” represented in her essay include painting, sculpture, architecture, photography, and film—all of which could reasonably be called “technologies of communication” (and thus media, in her taxonomy) in different discursive fields. One could say, for example, that they are technologies that communicate images, aesthetic value, affective responses, spiritual transcendence, history, and, of course, capital. But it is precisely the field of art history—and, in particular, the modernist category of art that, as Krauss acknowledged, has attached aesthetic and political value to mediums in their specificity since the 1960s—that she was trying to protect from the corruption of media in Voyage on the North Sea:

At first I thought I could simply draw a line under the word medium, bury it like so much critical toxic waste, and walk away from it into a world of lexical freedom. “Medium” seemed too contaminated, too ideologically, too dogmatically, too discursively loaded . . . If I have decided in the end to retain the word “medium,” it is because for all the misunderstandings and abuses attached to it, this is the term that opens onto the discursive field I want to address.

Krauss was writing at the start of the new millennium, when a wave of installation art, apparently indifferent to the traditions of discrete mediums, crested onto the international art scene. The installation art trend was a product of what she called “the post-medium condition”—itself a time of formal confusion among the arts that she loosely traced back to the late 1960s. To eliminate “medium” from her critical lexicon would be to fall prey to this wave’s seductive undertow. Instead, Krauss parted the sea of contemporary art into the good and bad “post-medium” works. The good: those artists who “have embraced the idea of differential specificity, which is to say the medium as such, which they understand they will now have to reinvent or rearticulate.” And the bad: those artists, “now cut free from the guarantees of artistic tradition,” who “engage in the international fashion of installation art and intermedia work, in which art . . . finds itself complicit with a globalization of the image in the service of capital.” In other words, Krauss distinguished between media—its bastard forms, its subjugation to the market—and art that reinvents mediums for a critical discourse no longer trained to its logic.

Krauss was adapting modernist criteria to postmodern art production (the cognate of her subtitle and Lyotard’s The Postmodern Condition is dispositive here). And this, too, was MoMA’s project in 2006. By reinterpreting media as a medium and absorbing it into the disciplinary logic of the museum, Lowry didn’t so much embrace the post-medium condition at MoMA; he quarantined it. Not until 2019, after a full museum redesign, did the antiformalist logic of media reach his museum; not until then did medium release its hold on the presentation of modern and postmodern art. Indeed, the “New MoMA,” as that overhaul was billed, promised that “contemporary art will join early masterpieces, and we’ll mix mediums—from painting to performance—and ideas.” It’s a new MoMA, finally facing up to a condition that has defined the art world since the advent of mixed, multi- and intermedia art—since, in other words, the consolidation of the media concept itself.

The most ambitious account of that concept’s emergence is John Guillory’s 2010 essay “The Genesis of the Media Concept,” which sketches the two-thousand-year “latency of the media concept” from Aristotle to advertising. The concept’s latency, Guillory writes, was “superseded by the era of its ubiquity,” the age of media, or the mid-twentieth century to the present. Vast as it is, however, Guillory’s project does not probe this peculiar historical movement—this supersession—from latency to ubiquity. Instead, his essay takes as a central conceit the preoccupation of art and literary criticism by the concept of representation over and above the concept of media and its theoretical cognates mediation and communication. Like Krauss, Guillory devises a media concept that primarily describes processes and technologies of communication as distinct from “fine art.” (This is what allows him to suggest that literature has a “less conspicuously medial identity” than film.) Unlike Krauss, though, he doesn’t oppose the concept of media to that of artistic mediums; nor does he give the historical distinction hermeneutic value: the media concept’s relation to the fine arts, he suggests, has been unduly “repressed.” This “repression,” he writes, has “tacitly supported the disciplinary division between literary and media studies”—to which we could add the division between art history and media studies too.

But Guillory’s psychological metaphors (latency, repression) elide the material processes by which the media concept was consolidated in the vernacular in the twentieth century, shaping the discursive field in which fine arts were produced and received. Indeed, the “repression” of the “medial identity” of the fine arts is a mid-twentieth-century phenomenon, not unrelated to the sudden “ubiquity” of the media concept in that very period. In the same decades that the media concept entered the vernacular, the “medium concept” began to shape art criticism, art history, and museum studies. This was no coincidence: mediums emerged as a category for the organization and appreciation of art as the dialectical counterpart to media and in response to the cultural imperialism of its mass-produced forms. As art became increasingly “public”—available to popular audiences through the technical innovations of film and photography and the generic innovations of midcult, masscult, and kitsch—mediums became the public face of art. Continue reading …

ANNA SHECHTMAN is a PhD Candidate in English Literature and Film and Media Studies at Yale University and a Senior Humanities Editor at the Los Angeles Review of Books.

Sharon Marcus and Celebrity

A new book from Sharon Marcus, Columbia scholar and friend of the journal:

The Drama of Celebrity (Princeton)

In this bold new account of how celebrity works, Marcus draws on scrapbooks, personal diaries, and vintage fan mail to trace celebrity culture back to its nineteenth-century roots, when people the world over found themselves captivated by celebrity chefs, bad-boy poets, and actors such as the “divine” Sarah Bernhardt (1844–1923), as famous in her day as the Beatles in theirs.

Sharon Marcus is the Orlando Harriman Professor of English and Comparative Literature at Columbia University. She is a founding editor of Public Books and the author of the award-winning Between Women: Friendship, Desire, and Marriage in Victorian England (Princeton) and Apartment Stories: City and Home in Nineteenth-Century Paris and London. She has co-edited two special issues for Representations, Description Across Disciplines (2016) and The Way We Read Now (2009).

“Boy, if life were only like this!”

“You Mean My Whole Fallacy Is Wrong”: On Technological Determinism

by John Durham Peters

The essay begins:

In Woody Allen’s romantic comedy Annie Hall (1977), the world’s most famous technological determinist had a brief cameo that in some circles is as well-known as the movie itself. Woody Allen, waiting with Diane Keaton in a slow-moving movie ticket line, pulls Marshall McLuhan from the woodwork to rebuke the blowhard in front of them, who is pontificating to his female companion about McLuhan’s ideas. McLuhan, as it happened, was not an easy actor to work with: even when playing a parody of himself, a role he had been practicing full-time for years, he couldn’t remember his lines, and when he could remember them, he couldn’t deliver them. In the final take (after more than fifteen tries), McLuhan tells the mansplainer, “I heard what you were saying. You, you know nothing of my work. You mean my whole fallacy is wrong. How you ever got to teach a course in anything is totally amazing.” In the film, the ability to call down ex cathedra authorities at will to silence annoying know-it-alls is treated as the ultimate in wish fulfillment as Allen says to the camera, “Boy, if life were only like this!” Rather than a knockout punch, however, McLuhan tells the man off with something that sounds like a Zen koan, an obscure private joke, or a Groucho Marx non sequitur. There is more going on here than a simple triumph over someone else’s intellectual error. Isn’t a fallacy always self-evidently wrong?

That a fallacy might not necessarily be wrong is the question I take up in this essay. Whatever technological determinism is, it is one of a family of pejoratives by which academics reprove their fellows for single-minded devotion (or monomaniacal fanaticism) to their pet cause. At least since “sophist” was launched as a slur in ancient Greece, it has been a regular sport to contrive doctrines that nobody believes and attribute them to one’s enemies. Terms ending with –ism serve this purpose particularly well. As Robert Proctor notes in an amusing and amply documented survey of academic nomenclature, “‘Bias’ and ‘distortion’ are perennial terms of derision from the center, and the authors of such slants are often accused of having fallen into the grip of some blinding ‘-ism.’” Often these -isms, he continues, “are things no one will openly claim to support: terrorism, dogmatism, nihilism, and so on.” (Racism and sexism are even better examples.) Terms ending with –ism, such as economism, fetishism, formalism, physicalism, positivism, and scientism, often stand for “zealotry or imprudence in the realm of method,” with reductionism standing for the whole lot. Corresponding nouns ending with –ist designate those people accused of harboring such doctrines—reductionist, fetishist, formalist—though –ist is a tricky particle. Artist, economist, psychologist, and above all, scientist have positive valences; artistic is a term of praise, but scientistic suggests being in the grip of an ideology. (It might be bad to be a positivist, but Trotskyist is strongly preferred to Trotskyite; it would be a big job to fully describe the whimsical behavior of the –ism clan, tasked as it is with policing ultrafine differences.) Pathologies such as logocentrism, phallogocentrism, and heteronormativity are often diagnosed in people who do not realize they are carriers.

Technological determinism belongs to this family of conceptual maladies thought unknown to their hosts but discernible by a savvy observer. It is one of a long line and large lexicon of academic insults and prohibitions. As old as academic inquiry is the conviction of the blindness of one’s fellow inquirers. From listening to the ways scholars talk about each other, you would not think they were a collection of unusually brainy people but rather a tribe uniquely susceptible to folly and stupidity. The cataloging of fallacies has been motivated by a desire to regulate (or mock) the thinking of the learned as much as of the crowd. The academy has been fertile soil for satirists from the ancient comic playwrights to Erasmus and Rabelais, from Swift and the Encyclopédie to Nietzsche to the postwar campus novel. Whatever else Renaissance humanism was, it was a critique of scholarly vices, and Erasmus’s Praise of Folly is, among other things, a compendium of still relevant witticisms about erudite errors. There are as many fallacies as chess openings, and the names of both index exotic, often long-forgotten historical situations. We shouldn’t miss the element of satire and parody in such cartoonish names as the red herring, bandwagon, card-stacking, and cherry-picking fallacies. “The Texas sharpshooter fallacy” is drawing the target after you have taken the shots. The “Barnum effect” describes the mistake of taking a trivially general statement as uniquely significant (as in fortune cookies or astrology readings). The study of fallacies gives you a comic, sometimes absurdist glance at the varieties of cognitive tomfoolery.

One reason why academic life is the native soil for the detection of fallacies is the great profit that can be wrung from strategic blindness. Looking at the world’s complexity through a single variable—air, fire, water, God, ideas, money, sex, genes, media—can be immensely illuminating. (Granting agencies smile on new “paradigms.”) A key move in the repertoire of academic truth games is noise-reduction. John Stuart Mill once noted of “one-eyed men” that “almost all rich veins of original and striking speculation have been opened by systematic half-thinkers.” A less single-minded Marx or Freud would not have been Marx or Freud. Intellectuals can be richly rewarded for their cultivated contortions.

But one man’s insight is another man’s blindness. The one-eyed gambit invites the counterattack of showing what has gone unseen, especially as prophetic vision hardens into priestly formula. Nothing quite whets the academic appetite like the opportunity to prove what dullards one’s fellows are, and, for good and ill, there never seems to be any shortage of material. (We all know people who think they can score points during Q&A sessions by asking why their favorite topic was not “mentioned.” Someone should invent a fallacy to name that practice.) At some point every scholar has felt the itch to clear the ground of previous conceptions and ill-founded methods; this is partly what “literature review” sections are supposed to do. (The history of the study of logic is littered with the remains of other people’s attempted cleanup jobs.) Scholars love to upbraid each other for being trapped by the spell of some nefarious influence. How great the pleasure in showing the folly of someone’s –istic ways! The annals of academic lore are full of tales of definitive takedowns and legendary mic drops, and social media platforms such as Facebook provide only the most recent germ culture for the viral spread of delicious exposés of the ignorant (as often political as academic). This is one reason McLuhan’s Annie Hall cameo continues to have such resonance: it is the archetype of a decisive unmasking of another scholar’s fraudulence or ignorance.

But it is also a classic fallacy: the appeal to authority. Who says McLuhan is the best explicator of his own ideas? As he liked to quip: “My work is very difficult: I don’t pretend to understand it myself.” You actually get a better sense of what McLuhan wrote from the blowhard, however charmlessly presented, than from McLuhan. The disagreeable truth is that what the man is doing isn’t really that awful or that unusual: it is standard academic behavior in the classroom at least, if not the movie line. Laughing at someone teaching a course on “TV, media, and culture” is, for many of us, not to recognize ourselves in the mirror. The fact that so many academics love the Annie Hall put-down is one more piece of evidence showing our vulnerability to fallacious modes of persuasion. Why should we delight in the silencing of a scholar by the gnomic utterances of a made-for-TV magus? Since when is silencing an academic value? And by someone who doesn’t really make any sense?

Silencing is one thing that the charge of technological determinism, like many other so-called fallacies, does. Fallacies need to be understood within economies and ecologies of academic exchange. They are not simply logical missteps. To accuse another of a fallacy is a speech act, a communicative transaction. The real danger of technological determinism may be its labeling as a fallacy. The accusation, writes Geoffrey Winthrop-Young, “frequently contains a whiff of moral indignation. To label someone a technodeterminist is a bit like saying that he enjoys strangling cute puppies; the depraved wickedness of the action renders further discussion unnecessary.” The threat of technological determinism, according to Wolf Kittler, “goes around like a curse frightening students.” The charge can conjure up a kind of instant consensus about what right-minded people would obviously avoid. The charge of technological determinism partakes of a kind of “filter bubble” logic of unexamined agreement that it’s either machines or people. Jill Lepore recently put it with some ferocity: “It’s a pernicious fallacy. To believe that change is driven by technology, when technology is driven by humans, renders force and power invisible.”

There are undeniably many vices and exaggerations around the concept of technology. But my overarching concern here is not to block the road of inquiry. (No-go zones often have the richest soil.) In a moment when the meaning of technics is indisputably one of the most essential questions facing our species, do we really want to make it an intellectual misdemeanor to ask big questions about “technology” and its historical role, however ill-defined the category is? What kinds of inquiry snap shut if we let the specter of technological determinism intimidate us? The abuse does not ruin the use. The question is particularly pointed for my home field of media studies, whose task is to show that form, delivery, and control, as well as storage, transmission, and processing, all matter profoundly. If explanations attentive to the shaping role of technological mediation are ruled out, the raison d’être of the field is jeopardized. It is so easy to sit at our Intel-powered computers and type our latest critique of technological determinism into Microsoft Word files while Googling facts and checking quotes online. We are too busy batting away the gnats of scholarly scruples to notice that we have swallowed a camel. Continue reading free of charge …

This essay, a contribution to the special issue “Fallacies,” offers both a genealogy of the concept of technological determinism and a metacritique of the ways academic accusations of fallaciousness risk stopping difficult but essential kinds of inquiry. To call someone a technological determinist is to claim all the moral force on your side without answering the question of what we are to do with these devices that infest our lives.

JOHN DURHAM PETERS  is María Rosa Menocal Professor of English and Film and Media Studies at Yale and author of The Marvelous Clouds: Toward a Philosophy of Elemental Media (2015).


Sonic Meaning and Language Politics

Real-to-Reel: Social Indexicality, Sonic Materiality, and Literary Media Theory in Eduardo Costa’s Tape Works


by Tom McEnaney

The essay begins:

In 1968, Vogue magazine featured an unusual new accessory. Ear (1966), a 24-karat gold anatomical replica that entirely covered model Marisa Berenson’s own ear, was one of a number of fitted extensions—there was also a finger, a toe, and strands of gold hair—that Argentine-born artist Eduardo Costa included in his Fashion Fiction 1. Photographed by Richard Avedon on one of Vogue’s most famous models, Costa’s jewelry—part sculpture, part ornamental prosthetic—attempted to parody the fashion industry even as it was absorbed into its pages. Playful and seductive, Ear wavered on the boundary—quickly eroding in 1968—between high-end fashion and vanguard art. At its most critical, Ear and other Fashion Fictions by Costa literalized the familiar reification of commodity culture: turning human body parts into objects, the works winked at fashion’s claim to be an extension of yourself. In repurposing the language of fashion, they also made sense in the Vogue of the late 1960s alongside the work of Andy Warhol, Roy Lichtenstein, Claes Oldenburg, and other artists. For, like these contemporaries in pop art or works from the Latin American neo-baroque, Costa’s ornaments reveled in the surface rather than condemning the superficial. This fascination with surfaces found an ideal corollary in Avedon’s photography, which celebrated the foreground. With Ear, Avedon’s portrait of Berenson became an almost mythic testament to the “statuesque” model, whose image recalls both a passing victim of Midas’s touch and a Galatea on the verge of breaking into the auditory world.

If Ear stopped there, however, we could stack Costa’s Fashion Fictions alongside Oldenburg’s everyday objects or Warhol’s Brillo Boxes—all three artists shared work at the Fashion Show Poetry Event held at the Center for Inter-American Relations in New York in January of 1969. But Ear distinguishes itself from pop art standards not so much for its send-up of commodity culture, as through its emphasis on the auditory image. This sculpture, or ornament, or prosthetic shows what it doesn’t tell: sound is everywhere implicit but nowhere physically present in the work. Asking its viewers to look at listening, Ear transforms the apparently ephemeral world of sound into a physical object.

This objectification of sound, whose effect on the wearer, it’s worth remembering, would be to mute or dull audition, ties in to the revolution in materializing sound in the 1960s. Like our own moment’s explosion of new technologies and formats for producing and consuming sound, postwar innovations in audio engineering, largely linked to the emergence of newly popular recording materials such as magnetic tape, renewed older concerns about fidelity and the realism of reproduced sound. Yet, notably different from most current criticism of digital sound’s apparent loss of fidelity, the 1960s technologies helped produce the cult of high fidelity, renewing nineteenth-century discourses of sonic fidelity and the belief that sound reproduction could become indistinguishable from the recorded source.

As I will explain in greater detail in what follows, Costa’s work at this time went beyond sculpture and concept to draw from new sound recording technologies’ ability to register and (re)produce sonic phenomena, and to bind these transformations to language and literature. In terms familiar to media studies, just as photography or film’s chemical imprint of the sun’s rays onto photographic negatives indexed physical traces of light, high fidelity seemed to expand what Friedrich Kittler would celebrate as the gramophone’s ability to inscribe the material “real” of sonic vibrations onto cylinders or shellac discs. Yet, while Kittler declared that electrical sound recording tolled the death knell of literature, Costa’s tape recording work in the late 1960s fuses the material index of media studies with what linguistic anthropologist Michael Silverstein calls the “non-referential social indexicality” available in language. Such social indexicality exists, for example, in the sonic attributes of a voice that can index a speaker’s age, nationality, sex, and so on. Against what has often been understood as the impasse between literature and media in the wake of Kittler, Costa brings together these two sides of the index to create a literary media theory and practice based in sound recording. Continue reading …

This article develops a linguistic media theory that brings together Peircean materialist indexicality from Barthes, Bazin, Doane, Krauss, and others with linguistic anthropologist Michael Silverstein’s nonreferential (social) indexicality. Following Argentine sound artist Eduardo Costa’s practice with tape recording, the article challenges critical theory to account for the sonic meaning at play in pragmatic (nonsemantic) communication related to gender, race, and diasporic community. More than a mere supplement or limit, material sonic media expand aesthetic representation, and media archaeology opens new possibilities to intervene in language politics.

TOM McENANEY is Assistant Professor of Comparative Literature at Cornell University. He is the author of several articles and the forthcoming book Acoustic Properties: Radio, Narrative, and the New Neighborhood of the Americas (Flashpoints Series, Northwestern University Press, 2017).

Fiery Cinema

Fiery Cinema: The Emergence of an Affective Medium in China, 1915–1945

by Weihong Bao

Berkeley Book Chats

Wednesday, Oct 19, 2016 | 12:00 pm to 1:00 pm
Geballe Room, 220 Stephens Hall, UC Berkeley

Mapping the changing identity of cinema in China in relation to Republican-era print media, theatrical performance, radio broadcasting, television, and architecture, in Fiery Cinema Weihong Bao constructs an archaeology of Chinese media culture. She grounds the question of spectatorial affect and media technology in China’s experience of mechanized warfare, colonial modernity, and the shaping of the public into consumers, national citizens, and a revolutionary collective subject.

A major contribution to the theory and history of media, Fiery Cinema rethinks the nexus of affect and medium to offer key insights into the relationship of cinema to the public sphere and the making of the masses.

Weihong Bao is associate professor of East Asian Languages and Cultures and Film and Media at UC Berkeley. Her short essay “From Duration to Temporization: Rethinking Time and Space for Durational Art” will appear in the fall issue of Representations (the special issue “Time Zones: Durational Art and Its Contexts”), available next month.

After an introduction by Andrew Jones (East Asian Languages and Cultures, UC Berkeley, and Representations editorial board member), Bao will speak briefly about her work and then open the floor for discussion.