
CAA News Today

ROBERT STORR CONVOCATION ADDRESS

posted by Christopher Howard — Sep 17, 2013

The following is an edited and revised version of the transcript of a talk given at the 2013 CAA Annual Conference in New York. The author has taken the liberty of including comments contained in his original notes for the talk that were left unspoken as well as that of smoothing out certain passages and eliminating repetitions and digressions in others.

The Art World We’ve Made, the Communities We Belong to, the Language We Use, and the Work We Have Yet to Do

Robert Storr delivers the Convocation address at the 2013 CAA Annual Conference in New York (photograph by Bradley Marks)

Occasions of this kind are very strange. Hal Foster was speaking earlier about triangulation, and the number of triangles I can draw in this room is kind of disconcerting, starting with the fact that Hal was once my editor at Art in America. Some of the other, more consequential ones will be touched on by what I have to say.

In my office at Yale I have a framed copy of Mad Magazine that I once used in an exhibition at SITE Santa Fe. The particular strip—it’s a two-page spread—reproduces a hypothetical scene from The Lone Ranger in which the Lone Ranger and Tonto are surrounded by a band of warlike Indians. The Lone Ranger says, “Looks like we’re in trouble, Tonto.” To which his sidekick replies, “What do you mean, we, kemosabe?”

That’s Postmodernism avant la lettre. Low stealing a critical march on High. There, in a nutshell, you have the problem of the pronoun before Barbara Kruger started to mine that field and plant more explosive charges. It is an example of how many of the thoughts that we take for granted as somehow being accomplishments of Postmodernism have actually been blowing in the wind for a long time—as part of a continuously developing dialogue about the obvious but habitually ignored contradictions that are embedded in our language, culture, and society. If you want to delve deeper into such things, allow me to note that as much of an admirer of Roland Barthes as I am—and I have learned a great deal from him and honor him for what he has given me—I am an equally great admirer of Oscar Wilde. His views on “The Critic as Artist” and “The Artist as Critic” were set forth seventy or eighty years in advance of Barthes’s, and Wilde’s bantering practice of negative dialectic is far more entertaining and effective than the scholastic versions of those same ideas that pervade classrooms and conferences nowadays.

However, in keeping with critical theory according to Mad, I will be cautious in my use of collective terminology or pronouns. And so I will begin this brief exercise in intramural deconstruction with the question “Who’s ‘we’?”—or, better perhaps, “Whose ‘we’?”—precisely to raise the question of whether “you” want to count yourself in or count yourself out with respect to what I have to say. In the simplest terms, what I mean by the “art world” is the combination of professional and semiprofessional realities and relationships amidst which we—and I do mean all of us—do our work as artists, critics, curators, scholars, teachers, and administrators. Some of us do all those things together. Some of us do them separately or sequentially. The lucky ones are those who only have to do one at a time, which, like most lucky things, is rare. By “art community,” I want to point toward a set of elective affinities that we establish with others who are similarly dedicated to art without regard to anyone’s official designation or career status.

Among the things that strongly attracted me to the art world when I first arrived on the scene—my initial exposure to New York dates to the late 1960s and was renewed in the late 1970s and early 1980s—was that you could meet virtually anybody just by showing up. At one loft party in 1968 I met Claes and Patty Oldenburg, Lee Krasner, Christo and Jeanne-Claude, George Segal, and Jasper Johns. At another in 1978 I met Ana Mendieta and Carl Andre, and at another in the 1980s—if memory serves—Nancy Spero and Leon Golub, David Wojnarowicz, Julie Ault, and Irving Petlin. The distinctions among people were not hard and fast, and the hierarchies in which they operated were surprisingly fluid in many though by no means all ways. Of course glass ceilings were everywhere, but on the whole people didn’t care a whole lot about who you were “officially”—they cared about what kind of energy you brought to the occasion. Bringing energy was your passport to being part of the art community because despite the constant commotion, the art world never has enough. Certainly never enough of the positive kinds.

For what it is worth, I’ve held professional positions in all the categories named above. But like many others here in this room, and many of the people that I have worked with over the years, I’ve also made my living at a host of jobs that have absolutely no standing whatsoever. Things like Sheetrocking, art handling, catering, and the like. For many years, I was a librarian and bookseller. Most of my education comes from the things I read while hiding in the stacks, browsing the shelves when floor-walking to prevent theft, or tending the cash register with one eye out for customers and one on the book tucked in my lap. By now I have worked at just about every level of what we routinely call the Culture Industry.

Now, I have always been very leery of that term—Culture Industry. Mostly I dislike it when it is used with glib pessimism or thinly veiled condescension by people who have never worked for any other industry. People who, adding insult to injury, persist in thinking that they are the saving remnant of unalienated labor and uncompromised criticality exempt from the challenges and delusions facing the rest of us who toil in ignorance for the sake of making a living, if not in some state of schizophrenic Late Capitalist delirium. I’ve always assumed that working for a living was just something you had to do; it’s how you paid for doing the work you wanted to do. Furthermore, I believed—and still believe—that well-chosen work that takes skill and requires ingenuity but isn’t too time consuming was preferable to an exalted career that was stultifying and time wasting. In sum, that an honest day’s work for an honest day’s pay at a blue-collar or a white-collar job is pretty much the same thing except for the wages and benefits. Which is to say, it is capitalist business as usual in a society where—alas—there is no socialist alternative. There’s no Ivory Tower either, just an Ivy Covered Mall.

Of course, the truth of the matter is that there is such a thing as the Culture Industry. We all know it, and one way or another all of us are, perforce, “cultural industrialists.” Some of us are in positions of authority; others work for those who command and are delegated power by them. None of us is immune from the compromises inherent in such affiliations—and none of us is entirely innocent of the offenses committed by the Culture Industry as a whole. Standing altogether outside it is not something that any of us can plausibly claim to be doing, not if we play an active role in the systems that “produce” culture and especially not if we enjoy the perquisites of that system. Correspondingly, one appreciates the candor of people who frankly admit that they are inside it, people who identify as best they can their actual place in the system and are straightforward about the tradeoffs they have made to arrive in it and make the best of their advantages. People, moreover, who try to be as honest as possible in judging others with respect to the problems and privileges they too have chosen to make their own and the charges that might conceivably be leveled against them as well.

Now, when I say “we” with regard to having made the art world what it has become, I mean my generation, more or less: the baby boomers and their younger siblings. We have come of age rising to positions of power that we once vigorously protested. Having done so, we are subject and should be subject to protest. With that in mind I keep a porcelain statuette from the Chinese Cultural Revolution nearby, a statuette of a professor on his knees wearing a dunce cap with a righteously accusatory member of the Red Guard standing behind him. In reality the mass hysteria of the Little Red Book of which I caught a distant glimpse in those days was not a laughing matter. But gallows humor has its merits, and I know that such a comeuppance remains a possibility—because we have in fact become the “figures of authority” we once considered our perpetual enemies. The jury is still out on whether we have also become the “ciphers of regression” we took them to be, but inasmuch as it is supposed to be a jury of our peers we should be careful that it is not packed in ways that make self-criticism a pro forma ritual and all other judgments those of an ideologically dogmatic kangaroo court. In any event there’s no doubt that the outstanding question is: “Have we actually done better at playing our part in a system we have chosen to enter than our predecessors did, or have we failed to do better than they did or even as well as we could?”

Those who are much younger than I have ample reason to dissociate themselves from this part of what I have to say, and to complain that they have inherited a situation for which they should not be held responsible. Indeed they are not responsible. In fact, that situation may have scant place for them at all, either in the near term or the long term, and even then only room at the very top or very bottom and in the margins, but little if any at the core. It’s an art world that was built during a time of enormous prosperity—and also of cultural myopia and self-centeredness. The important thing we must come to terms with—and here again I am speaking of all of us, young and old, inside as well as outside the magic circle—is that the old art world has run its course. It’s over, a thing of the past, and all of us know it. Whatever comes next will not be another iteration of the “postwar modernist” or “postwar postmodernist” model of an art world, nor an improvement extrapolated from it. It will, and must be, fundamentally different. In almost all cases in which the arc of a great empire reaches its apogee and turns downward—which is not the same thing as beginning a long inevitable slide into decadence, although such definitive decline is always a possibility—the diminution of momentum and influence that results rattles many and affects absolutely everyone. In “our” own art world we are experiencing such a diminution of growth and influence—and in many locales a sharp contraction of opportunity in relation to the supply of aspirants—even as we are simultaneously witnessing the unanticipated proliferation, expansion, and diversification of other art worlds.

One of the best tests of how serious people are in their cultural politics is whether, under these circumstances, they cling to their privileges—claiming that they, among all people, are entitled to them, whereas those other people are not—or whether they begin to reexamine those privileges and begin to consider what new social contracts need to be written, what new professional standards need to be set and observed.

As I indicated at the outset, there are a lot of problems with Adorno’s way of characterizing the Culture Industry and those who labor in it, partly because he, as a scholar and semiprofessional composer, was immensely uncomfortable with the fact that others made their money from their cultural production. He didn’t like jazz because he saw it as being a form entirely subordinate to the demand for commercial entertainment. But not liking jazz in the 1930s and 1940s, aside from being inexplicably incurious and insensitive—not to say knuckleheaded and tone deaf—was in one way or another enmeshed in cultural prejudice against the people who made it—mostly blacks. In that respect avant-garde “purism” wasn’t all that much better than the reactionary populism of the period. In sum, Adorno’s conception of music and of modernism generally could not have been more elitist in its fashion, even though the cause in which it was ostensibly being put forward was Marxism. Looking backward, the whole situation makes one wonder how Robert Farris Thompson would have handled Adorno’s condescension toward Afrocentric genres. For starters, perhaps, by dancing circles around his rhythmically challenged professorial counterpart. Meanwhile, looking backward and forward as well as at the present, we know that few if any of Adorno’s disciples in the art-historical guilds have written extensively—if at all—about African or African American art.

Such biases and blind spots are prevalent and problematic in much of the cultural heritage of the Frankfurt School, which, for all its positive contributions, was undeniably Eurocentric to a fault. We should be grateful for those who did not propagate or suffer such ingrained attitudes, but we should never forget that a good deal of critical theory was tainted by them—and still is. Meanwhile, Adorno’s “jazz problem” is a class problem as well as an ethnic one. Like many academicians, he implicitly as well as explicitly looked down on those who made art to sell, and his complaint against jazz bespoke that contempt. Yet artists—at least those who have galleries—understand that if they are to continue to develop their work, they must find a market for it. And they must maintain that market in all the ways required by their industry. I don’t scorn those who have a salaried teaching job—I have one too—but it is all too easy for somebody with such a guaranteed income to “tut-tut-tut” about people who work with their hands as well as their minds, all too easy to bemoan the supposed commercialism of those in other economic sectors in which deals are made—as if professors in a position to do so don’t negotiate hard for salaries and benefits—and to scold people who are unapologetically merchants and actively “move the merch”—as if scholars didn’t keep a fairly exact count of their number of publications.

Unfortunately, though, it happens a lot. The awkward fact of the matter is that art has always sold for money. Art has always been involved in social strivings. Art has always been involved in patronage. Not long ago I was in India, where I visited the caves at Elephanta, and a guide pointed out to me that underneath one of the transcendent carved Hindu figures was an inscription naming the person who paid for it. Of course in Renaissance paintings we frequently see donors on their hands and knees in front of crowded nativity scenes, donors put there at their own request by the artist in order to be remembered for having paid for the picture. Still, the cry of “Oh my God, there’s gambling going on in Rick’s Café” echoes throughout an awful lot of cultural criticism, and it tells us that those who have just made this shocking discovery—or have suddenly chosen to be theatrically shocked by something they noticed a while back—have not been thinking historically about the fact that there’s always been gambling going on in Rick’s Café. The pertinent questions are: “Who owns the casino? Who’s playing? What’s the ante to get into the game, and what are the stakes if you want to stay? Are the tables level? Are the cards stacked? Who wins? Who loses? What do those who walk away with the pot do with it?” In this context, one is better off reading Fyodor Dostoyevsky or tough-guy detective stories than Karl Marx.

In any case, when I started out I was naïve to the extent that I did not have a clear idea about what I wanted or how I was going to survive. Like many of you, I had my studio practice. And, like many of you, I also had other things on the side. To keep moving toward my vague goals and make ends meet while I sharpened my focus I lived “betwixt and between” for a very long time. I was thirty-nine years old before I got my first tenure-track teaching job. I was forty before I ever worked in a museum. Up until then, everything I managed to do was catch-as-catch-can. Now many if not most of my students, be they artists or art historians, find themselves in a similar situation but in a very different world. Because things aren’t going to break for them the way they used to—the way they did for me. Many of those who teach are essentially destined to be eternal adjunct professors. My sister, who is a social historian, was an adjunct professor at a university in Canada until she was in her late forties, and only got a tenure-track job and finally tenure past the age when most people would have thrown in the towel. There are many others in our fields who have been in her situation and who never succeeded in finding any degree of security or recognition.

Robert Storr (photograph by Bradley Marks)

Nevertheless when I started out knowing what I wanted to do in general terms but not knowing how I was going to support myself, I did not have great expectations. I certainly never imagined that I would ever live in a lavish way, not that I do even now. I’m utterly surprised that I have had the jobs that I’ve had and the financial security that I have. (Many friends do not.) Maybe some of the people who gave them to me are surprised, too! In any case, I am grateful for my situation and have never felt that it was owed to me. What I aspired to was to spend my life, or as much of it as I could manage, doing the thing that interested me most. And, as I stated before, in support of that ambition I planned to do work that took the least away from me and gave the most back to my primary concern. That was the prospect in the 1970s. It seemed to be a reasonable way to go about things.

Having said all of that, I am now in a world—we are now in a world—and by “we” I mean all of us in this room—in which that way of going at things—simply launching yourself based on a series of part-time jobs in order to do in your spare time what you want to do—is the longest of long shots and may be all but impossible. We all know that one can no longer support oneself—much less other people if one has them in one’s life—on part-time jobs. The generation of postwar artists to which this year’s awardee Ellsworth Kelly belongs could rent a studio for $100 or $200 a month and get a lot of space. And they could pay for it by carpentry, art handling, bartending, paste-up, and other such things—and still have three, four, or even five days a week left over for themselves. Those days are behind us.

The same thing is true for people entering the Culture Industry on the art-historical side of the equation, as academics, museum workers, and so on. If you’re going to teach part time at three, four, or even five different places as I once did—one course here, one course there, up and down the Eastern Seaboard, back and forth from the Midwest to the West—in most cases you still can’t make enough to survive decently for very long and, given the stress of such employment, you won’t be able to produce much if any serious scholarly work. So we’ve reached a point in which even the most intense drive to do something smacks right up against the possibility that one may never be able to do it at all, or not very well, throughout an entire “career,” much less follow a career path that proceeds in a rational, predictable fashion from one stage to the next in the manner Allan Kaprow more or less predicted would become the norm for artists in the 1960s when he wrote his brilliant polemic against the old bohemia, “The Artist as a Man of the World.”

Times are not good. They have not been good for quite a while—and they’re not going to get better any time soon, if ever. The dire budget cuts currently taking place in the California educational system were mentioned earlier today—Kaprow taught at UC San Diego for many years—and California was the national model for the support of teaching the arts in state colleges and universities. So that situation is an ominous bellwether of what is to come in more vulnerable systems. I work at Yale University, which is one of the richest universities in this country, and I can tell you that as the dean of the School of Art I have had to impose sharp budget cuts for five years running. So far we have been able to protect student aid and faculty positions and salaries. Everything else—down to pencil counting, literally—has seen reductions. It’s not going to get any better for quite a while. We’re making a concerted effort to raise money, with some success, and we’ll try to ride out the storm with as little damage as possible. But the days when one could take resources more or less for granted and ignore the looming long-term future of the institutions in which we work are over. And the days when faculty and students could blithely expect somebody else to mind the store so that they could do exactly as they wished without a thought to the economic realities in their corner of the Culture Industry are over as well.

Respect for people who hold institutions together is as important as respect for people who build them. The “avant-garde” entrepreneurs who created the great alternative-space system in this country are extraordinary in every way. But the people who manage to operate creatively within big Behemoths—be it Yale or MoMA or the California system or wherever it is—and who hold those institutions together in fair-minded and aesthetically forward-looking ways year in and year out, from budget cut to budget cut, also represent a force for good in our community, although one that is not much recognized or honored. I’m not speaking about myself—I’ve had more than enough attention. I’m pleading for some of the people mentioned earlier in today’s proceedings, some of the people with whom I worked closely and who, as Rodney Dangerfield said, “can’t get no respect,” at least not in quarters where “institutions” are viewed monolithically.

Indeed, one of the principal failings of what passes for “institutional critique” is an inability or unwillingness to address specifics and a penchant for sweeping generalizations that too often brush aside inconvenient historical facts, anomalies, and counterexamples—not least of them the instances in which professionals inside museums, foundations, and universities have fought the good fight and occasionally won it. As someone who has worked in the Behemoths of which I spoke just now, I am an ardent advocate of institutional critique. But institutional critique should “start at home”—as an “inside job”—and what I am trying to do is to foster one in our world, in our sector of the Culture Industry. During most of the last twenty-five years, when the topic has come up, the institutions being critiqued are The Market and The Museum. What has not happened is an institutional critique of the educational system that feeds into the market and museum sectors. That system is the third leg of the stool that holds the art world together and on which the art community sits. On the whole we have evaded examining ourselves too closely, avoided parsing the contradictions of our own positions and responsibilities in too much detail. There has not been much in the way of an autocritique—certainly not a Maoist one, but not even a coherent Marxist one—among academics in this period of time, though many academics subscribe to Marxist ways of thinking. And whenever the issue arises, finger-pointing and evasion begin—because people don’t want the scrutiny. They don’t really want to be told that they are not entitled to many of the things they have come to expect. They don’t want to hear that the things they have come to expect may actually be taken away from them because these things cannot be sustained after all, and that sustaining the essential parts of institutions sometimes requires doing without something one has become accustomed to. Something that may be taken away not by some terrible specter of reaction, though such specters abound and I am not making excuses for their false economies, but by thoughtful people trying to make the best of a bad turn of events. (In the Academy, crying wolf and making gross comparisons between our troubles and Germany under Adolf Hitler or Italy under Benito Mussolini is frequently the first line of denial when it comes to realities that have shifted beyond anyone’s ability to prevent their shifting.) In practice institutional critique may mean radically resetting or staunchly reaffirming priorities—at a cost. It may mean that a basic, long-haul redistribution of shrinking rather than expanding resources is the business at hand. It means doing that in a manner that is responsible and fair but above all one that is consistent with the fundamental purposes of the institution as well as with the promises implicitly or explicitly made to those who enroll in it and those who work for it. So far as universities are concerned those questions also include the working conditions of those who staff them, which, to a large extent, means relatively recent graduates of the system. How long can American schools maintain their current levels of excellence based on underpaid adjuncts who serve year in and year out, without ever having a realistic chance of rising above their entry-level status? How long can they continue to hand out diplomas without fully informing students of the long odds of their ultimately finding employment in their field? The allocation of labor in the educational system resembles migrant farm labor in far too many respects, and it’s been that way for a long time.

When I talk about “we” who must address these issues in this particular set of circumstances, I am, as I said, talking about the people of my generation. But this “we” has to be subdivided in some important ways. The “we” I am speaking of was once overwhelmingly composed of people like me, white Anglo-Saxon Protestant males. It took a scandalously long time for other “we’s”—women, people of color—to break into the monochrome, monogendered, and for that matter predominantly straight art world. And “we” should not pat ourselves on the back for the progress that “we” have all too slowly made because the work is far from done. For example, the Yale School of Art has graduated artists of color in appreciable numbers since the 1960s—from Howardena Pindell and Martin Puryear to Kehinde Wiley, Mickalene Thomas, and Wangechi Mutu—but when I became dean there was only one artist of color in a tenured faculty position in the School of Art: Robert Reed, who teaches there still. That had been the situation since the early 1960s when he was hired. Which makes it obvious that none of the discussion of “diversity” since before the deaths of Martin Luther King Jr. and Malcolm X had any effect on the actual representation of African Americans at one of the leading universities in a heavily black North American city. Meanwhile, in 1990 Sheila Levrant de Bretteville was the first tenured female faculty member to be hired at Yale, and due to the paucity of available positions she remains one of only two women with tenure. An equivalent dearth of people of color has prevailed at MoMA, where the only ranking curator of African descent from the 1960s into the new millennium was Kynaston McShine.

I mention these statistics because despite the prevalent talk about a postracial, postsexist society, they describe the abidingly conservative reality within nominally liberal institutions. Inaction does speak louder than words! Institutional critique under such circumstances means more than adding courses on The Other and The Subaltern or exposing the inconsistencies and failures of other parts of the system; it means analyzing our own inconsistencies and failures and then taking appropriate corrective action. Theory without praxis, as thinkers from Aristotle to Marx will tell you, is doomed to failure. It also condemns the disengaged theorist to self-incriminating disclaimers.

A while back Adrian Piper told me that she once asked Rosalind Krauss why she hadn’t written on any artists of color, and Krauss answered that she would if she could identify one of sufficient “quality.” Sound familiar? Like the neocons Clement Greenberg or Hilton Kramer? In any case I don’t believe Krauss has yet found what she was looking for. Indeed, Art since 1900—a textbook edited by Krauss, Foster, Yve-Alain Bois, and Benjamin Buchloh—contains two chapters on artists of color, both of which were farmed out to an adjunct member of the team. So far as I know, these chapters were the only ones handled in that manner, suggesting that even when the “discourse of The Other” gets its due as “metacritique,” actually paying attention to others is still considered a distraction from the main events of art history.

The other side of the coin of correcting the longstanding exclusivity of “we” is, of course, that once women, people of color, and others who have been kept out of the power structure are finally brought into it, they will join the “we” that has problems to address besides those which may have been partially ameliorated. I would like to mention a few in particular. As I look at the last twenty-five years in this field—by “this field” I now mean the whole of it, whether it is classes taught in art schools or classes taught in art-history departments—I am repeatedly struck by a set of phenomena we usually identify with the financial sector. If we’re going to engage in institutional critique, there’s no way of sidestepping it in our own. Because we know without any doubt that spasms of inflation and deflation threaten the world economy, and we know that derivatives aplenty were created out of thin air, resulting in untold risks to the value of everything. We know that dubious financial schemes and absurd social and political propositions were sold to the general public based on predictions that things would turn out a certain way with little regard for what would happen if they didn’t. We know that our entire economy nearly collapsed because of the misuse of abstract ideas and the career-making, brand-building flogging of bogus ideological products.

The same thing has happened in our world, and the potential consequences are equally severe. In the academy we have witnessed the spawning of all manner of theoretical derivatives, that is to say, highly attenuated versions of once-compelling ideas. We have rogue hedge funders too, in other words, speculators who predict one outcome but then as a matter of caution or cynicism bet against the thing that they just predicted. Divided between Utopians convinced that sooner or later the Revolution will sweep away the mess we’re in, and Dystopians who are just as sure that tomorrow we’ll become Nazi Germany—they are nevertheless at pains to secure their institutional futures as if the sky were the limit and there were no clouds on the horizon. As in the financial sector we have heard old “common sense” laws being pooh-poohed, and all manner of generalizations built on suppositions that have not been, and frequently cannot be, proved or disproved. Did a given prediction come true? How might we find out? Even asking such questions calls down the accusation that one is stuck in old ways of thinking or, worse, crudely anti-intellectual.

As a result we have several generations of students who run around referring to historical paradigms and using ideological catch phrases in the studio or outside the studio, in seminars or outside seminars, with little if any idea of what those paradigms or phrases actually signify. If I’m not mistaken, Ernest Mandel—a Belgian Marxist whose lectures I attended and whose ideas I take seriously (though no longer trust)—was the first person to develop an extended model of Late Capitalism, although there were cranks like Lyndon LaRouche who did as well. Parenthetically, at this juncture I’d venture that at its best critical theory has been a useful analytic tool, though never a useful political tool, but all too often it has been the basis for a genre of what one might call Social Science Fiction—the artistically conservative Jean Baudrillard being one of its exemplars—with the likes of LaRouche and other ideological cultists being the Scientologists of the imploded radicalism of the generation of ’68.

In any case Late Capitalism, whatever it may have been for Mandel, has not turned out as he predicted it would. Indeed, Fredric Jameson and others influenced by Mandel have backed away from the term, but still it remains part of the catechism of critical theory. In any event, rather than capitalism’s “lateness,” which I take to mean its imminent decline or failure, we are confronted with rampant, astonishingly adaptable capitalism, in particular novel state capitalisms of every description developing in countries that were formerly socialist. I suppose that one could argue that this is just one of the stages along the way to what Mandel and others predicted, but there’s not much to go on in these theories regarding when we will know that we have in fact reached capitalism’s next and “final” stage. Instead of living through the era of Late Capitalism, we’re living through that of Late Socialism, and while I take absolutely no comfort in this fact, what I’m trying to say is that using simultaneously vague and grandiose terms of this kind as general descriptions of a complex set of social and economic relationships that condition how culture changes, how it’s disseminated, and how it’s thought about does no good at all and may do a fair amount of harm to the young people trying to figure things out for themselves.

If you have a student who hands you a paper or presents you with a picture and says, “And this is about late capitalism,” the obvious question is, “Well, what do you mean by that?” Of course, most students don’t know because most of them have been taught by professors who bandy about such ideas without having grounded them in economics or history with the rigor that scholars in those fields would require. Most students don’t know how to begin to teach themselves because the history of these ideas has actually been obscured, kept away from them—because the level of abstraction common in art schools and art-history departments has been kept so high, and the game of keep-away has been so perfectly manipulated by a wink and a nod among primary users of these ideas who have no desire to be called to account, that if anybody really asks the stump-the-teacher questions—“What do you mean by that and what examples can you offer? Did the thing you say was going to come about actually happen?”—they are met with silence or with a barrage of obfuscation and browbeating. For obvious reasons, though, such moments seldom come. Students and skeptical colleagues have been too intimidated by the claims of those who introduced these counterfeit notions and built their reputations on their derivatives. Nobody so far as I know repealed the laws of gravity. And nobody repealed the laws of supply and demand. Instead, the antiempirical practices of economic fantasists nearly brought us to our knees: men and women who believe in Ayn Rand the way others once believed in Mao. Critical theorists are playing for smaller stakes, but calling their bluff can amount to the same thing.

Remember the days when it was the Republicans who said, “You are unfortunately a fact-based constituency, and we’re in a brave new world”? I have heard some of my colleagues say, “You are a crude positivist because you ask questions about matters of fact.” Now, I am the son, brother, and brother-in-law of historians. I learned social scientific methods from Marc Bloch to Fernand Braudel at the dinner table. I have a fairly clear idea of what the pertinent historical methodologies—plural!—are, and what their value can be. But so much of what we do or see being done in our field nowadays is not art history at all; it’s ideological historicism. It’s the comparison of one situation to another without regard for obvious differences of moment, of social, cultural, economic, and political setting, of the disparity between aims and outcomes, what should have happened and what did, and without any requirement that theories or speculations be checked against evidence.

Robert Storr (photograph by Bradley Marks)

For example, in the 1980s it was an article of faith in certain quarters that Neoexpressionism signaled a slide into crypto-Nazi barbarism. Moreover, this canard is alive and well in articles and textbooks that fill the bibliographies of courses and seminars. Now, let’s get real! Ronald Reagan was not my cup of tea at all—but he was not Hitler. And Julian Schnabel is not my cup of tea either, though he has made some images I can’t forget and will get credit from me for that—but he is not remotely the equivalent of the painters officially favored by Hitler, whose taste was narrower than that of most Fascists and who mounted the famous Degenerate Art exhibition to condemn Expressionism even when practiced by Nazi fellow travelers like Karl Schmidt-Rottluff and Emil Nolde. Since we’re among art historians, we should be clear, while we are at it, that during the 1920s and 1930s the aesthetic Left and Right were easily confused because members of the avant-garde and the arrière-garde kept switching sides and because, apart from Hitler, neither the Right (Fascism) nor the Left (much of Stalinism) had a consistent party-line aesthetic until late in the day. During the 1970s and 1980s things were tidied up by critics eager to rewrite history in the image of their prescriptive norms. In their hands abstraction became ipso facto “progressive” and figuration ipso facto “regressive,” with scant room for nuance. The thing that went unmentioned is that while a large number of figurative painters were on the Right, although by no means all were Fascists, an equal number were on the Left—in Germany, France, the USSR, England, Mexico, Brazil, the United States, etc.—many of them actively so. Meanwhile, a good many of the Futurists, who were, generally speaking, abstract artists, were actively Fascists while as a rule the figurative Metaphysical painters—although backward looking—were not.

In other words, the caricature that skewed art history in the early 1980s was a polemic, but it was a polemic authored by an art historian based on historical examples that were patently false. Why do I mention this? Because, as I mentioned before, those articles are assigned and read today. Because those terms and those examples are used and cited today. Because the abstract, conceptual, and other “neo-avant-gardes” of the 1970s recoiled in horror at a stylistic sea change and the advent of a new generation, creating an overtly distorted “history” for strategic ends. Because when my students in art history or studio programs repeat these falsehoods one doesn’t know where to begin to disabuse them, how to disentangle the snarl of conclusions drawn from those dogmas. Because these falsehoods are the official opinion of very large parts of our community, and few if any of those who have disseminated them—and many hold high posts in the academic establishment—have any interest whatsoever in recanting or revising those views, or even just modestly saying “Oops!” Evidently “theorizing” means never having to say you’re sorry.

Lest this sound too sweeping on my part, let me say that T. J. Clark, with whom I disagree on some issues, is an art historian I greatly respect and from whom I’ve learned much. So too was Meyer Schapiro, whom I knew and about whom I wrote. Thanks to him Marxist art history in this country flourished in many influential ways; thanks to him as well psychoanalysis and semiotics entered the field in similarly rigorous and thoughtful ways. I also knew and admired Leo Steinberg, whose work I reread regularly. It is possible to be engaged with all of those thinkers and yet dissent from the uses if not misuses to which their work has been put by people who eagerly seek to legitimize their work by association but who cannot hold a candle to them.

Now let’s consider the matter of predictions. Remember when painting was declared obsolete and photography was held up as superior to it, because photography used more modern technology and, as a multiple, was less subject to commodity fetishism? Tell that to Andreas Gursky, and when you’re done ask Schnabel if his work has ever sold at auction for a price approximating what Gursky gets for a single print from one of his many editions. Remember the end of the museum? Now an awful lot of the people who were involved in that “discourse” are regulars on museum panels, and quite a few found themselves museum jobs. Indeed, it would seem they always aspired to jobs in museums but while waiting to be tapped they threw stones at the glass house they hoped to inhabit—maybe just to get attention. As a young critic I too did my fair share of criticizing museums—as Larry Rivers and Frank O’Hara long ago pointed out, it goes with the territory—but I never intended to work in one and turned down a MoMA job when Kirk Varnedoe first offered it to me, never expecting to be asked again. When asked a second time and after canvassing artist friends about whether to accept the offer—Félix González-Torres’s encouragement was the decisive vote—I discovered that museums are pretty interesting places to work after all. I also discovered that some of the most radical people around were actually working in museums already. By which I mean people who made substantial change happen while others just talked about it. Barbara London, for one example; Deborah Wye, for another. Both were long-term curators at the Modern who had without fanfare opened its doors to new media, new artists, and cultural diversity. So I applaud dissenters going into museums, just as I respect radicals who teach, because I look forward to having colleagues both inside and outside institutions with whom to work on the project of changing them—because if you have alliances across that membrane, you can do remarkable things.

Nevertheless we keep hearing the same screed that The Museum is inherently the enemy of art, that The Museum is inherently the toy of the rich—as if everyone with wealth was of the same opinion about culture and politics, which I can assure you is not the case—that The Museum is inherently inhabited by lackeys of the Culture Industry whose only agenda is to advance their careers and the special interests of those who hire them. Well, I am not going to say that museums aren’t in trouble—they are. I use the plural because they are not all the same, in mission, in history, in location and hinterland, in resources, or in flaws. But individual agency and collective action can make and has made a difference. At times dissent compels one to break ranks. I am doing that here in the academic world. And at times one must step away to make change. On the same score I think I am safe in saying that I’m the only person in this room who has walked away from a senior curatorship at the Museum of Modern Art because I didn’t think I could do the job under the current administration the way it needs to be done. I am not boasting, but sometimes “institutional critique” requires taking actions that have painful consequences.

On to other things: whose opinions does one take seriously? About two weeks ago, I was invited to a restaurant in Paris by friends, who had also invited Alain Badiou. I’d read some of his writing and we had a pleasant conversation—or rather I listened to him expound in a bonhomous way, mostly about very old Leftist ideas, and we ate a couple of rich steak frites with good red wine. And as the talk of revolution proceeded I was reminded, as I have been before, of a passage in Gustave Flaubert’s Sentimental Education in which he describes how the radical nature of the post-1848 generation was ruined by rhetoric and good dinners. About ten years ago I had lunch with Paul Virilio, during which I listened as he spoke marvelously about megalopolis and terrorism. I was captivated, as one can easily be, by his carefree theory spinning and his elegant turns of phrase. Indeed, his gift for juggling words and ideas is really very suggestive, and there are uses for that kind of thing. But then after he’d been talking about the modern city for an hour or more I said, “Listen, Paul, when you’re next in New York, would you look me up?” At which point the man who was our host said, “You know, he’s never been to New York. He doesn’t fly.” Which means he’s also not been to Mexico City, São Paulo, Tokyo, Mumbai, Beijing, or Los Angeles. Yet he is an expert on megalopolis.

Many years ago, Ezra Pound wrote in his ABC of Reading that generalizations are checks written against the bank of knowledge. The question always is: what is there to back them up with? Criticism has been kiting checks in our world for the last quarter century. It’s time, I think, to send the collection agency around and ask what is behind some of these generalizations. Not in order to destroy the fundamental points being made by the people that have bounced so many theoretical checks—often their ostensible aims are legitimate—but to make those points better, to make them work, to intellectually restore the currency we use.

The radical impetus behind much critical theory is inarguable and positive, but that positive impetus has been corrupted by cavalier if not frankly dishonest casuistry, and above all by historically untethered interpretation of the key concepts. There is also a dark side to such practices. Slavoj Žižek is all over the place, talking up the marvels of revolutionary violence and political terror. But this is sinister sophistry in a world in which bloody terror is practiced with equal callousness by insurgents and by states that could not care less about Jacobin discourse. Once again, as in the 1960s and 1970s, we are reliving the cycle that goes from idealism to terror, from Ban the Bomb to the Red Army Faction, from the Student Nonviolent Coordinating Committee to the Symbionese Liberation Army. In that context it is highly problematic that Žižek should be held up as an intellectual hero encouraging people to be bravely destructive, following examples set in distant historical circumstances—Maximilien de Robespierre and the French Revolution, Leon Trotsky and the Russian Revolution—without really taking responsibility for how this resonates with people who live in places where terrorism has become a harsh reality and where the responses to terrorism, those of the hysterical counterterrorists, are equally dangerous to us all.

Going back to art history proper—and reflecting on all the places where Virilio has not been but where a good many of us have been—we need a whole new generation of scholars ready to write serious critical, analytical, synthetic, even narrative art histories about what happened in Mumbai, what happened in Johannesburg, and what happened in a host of places where there had been active art worlds for over a century, or at least for a “short century.” The Latin American art world is as old as ours. The year 1913 is the date of the Armory Show that kicked off modernism in this country; the Week of Modern Art in 1922 did the same in Brazil. Yet there really isn’t a readily available, thoroughly integrated history of art for our students that has both of these developments running side by side as it moves forward to the present, such that the southern and northern hemispheres of this part of the world are treated holistically—not “globally,” as I hate that term, which should be given back to business and the military—and treated across the board as postcolonial history. The United States is a postcolonial country, no less than Brazil is a postcolonial country—or Mexico, Cuba, Peru—no less than all of the Caribbean and all of South America, and no less than Canada. My wife is Canadian, and she keeps telling me, “Don’t forget us!” And we shouldn’t. So that when we talk about Georgia O’Keeffe we also think of Emily Carr and Anita Malfatti, not as add-ons to existing histories of the avant-garde in New York and Paris, as has been the custom, and not as women belatedly inserted into the canon to correct its gender bias, although that much needed to be done, but rather as protagonists integral to the story of the origination of modern art worldwide.

Thinking in such cosmopolitan terms isn’t just a backward-looking project. It’s forward looking too—and I address these next remarks to those of you on the studio side. Because, if you are currently an abstract painter and somewhat unsure of whether there’s a place for you in the scheme of things, maybe you should be looking at Argentina, Venezuela, and Brazil, as well as Central Europe, to find aesthetic examples that correspond to your interests, examples that will teach you and challenge you to do more with what you’ve got. Work by artists you’ve not previously heard of but who are every bit as much a part of the modernist project as those to whom you’ve been overexposed.

In sum, we have a lot to teach that we have not taught, but we have an awful lot more to learn before we even begin to teach. And all of us have to rid ourselves of the presumption that this is merely an adjustment or retrofitting of our existing models. It actually entails fundamental rebuilding of all of them, which requires dismantling them first without becoming fixated on negation as an end in itself. We have to disenthrall ourselves from ideological generalizations that have bedeviled exchange among us and slowed the process of resetting our compasses. We have to wean ourselves for a time from speculative thought as a substitute for actual research. We have to do what Arthur Rimbaud calls for in his prose poem “Adieu,” in A Season in Hell, which recounts his decision to forswear the intoxicating ether of Symbolism and come back down to earth in order to become “absolutely modern.”

In conclusion, I’m going to read a short passage that I’ve included in a couple of essays and reread often myself, and then end with one short anecdote. It won’t take more than five minutes, so I hope that’s all right. It’s a passage by one of the greatest and most studied modernist critics that, strangely enough, is seldom cited. Maybe once I’ve read it you’ll understand why that’s the case. It’s from Charles Baudelaire’s review of the Exposition Universelle of 1855. He wrote:

Like all my friends I have tried more than once to lock myself in a system so as to be able to pontificate as I liked.

Parenthetically, think of all the books on all the critical-theory shelves at St. Mark’s Bookshop, books of which most buyers read just a chapter or two, only to move on to the next one, and the next, generally without going the distance in any of them. And generally, without admitting to themselves or anyone that they were just grazing in the first place—no shame in that, I suppose; Walter Benjamin did it but he confessed—until they took up another fashionable idea. Now back to Baudelaire:

But a system is a kind of damnation that condemns us to perpetual backsliding. We are always having to invent another and this is a form of cruel punishment. Every time some spontaneous, unexpected product of universal vitality would come and give the lie to my puerile and old-fashioned wisdom, much-to-be-deplored daughter of Utopia, in vain did I shift or extend criteria. It could not keep up with universal man.

By the way this is the middle of the nineteenth century, so I’m not going to anachronistically regender his text, but you understand what he’s saying.

It could not keep up with universal man. It was forever chasing the multiform, multicolored beauty that dwells in the infinite spirals of life. Under the threat of being constantly humiliated by another conversion, I took a decision—to escape from the horrors of these philosophic apostasies, I arrogantly resigned myself to modesty. I became content to feel. I came back and sought sanctuary in an impeccable naïveté. I humbly beg pardon of the academics of any kind who inhabit the different workshops of our art factory….

Baudelaire wrote this a hundred years before Adorno:

for only there has my philosophic conscience found rest, and at least I can now declare, insofar as a man can answer for his virtues, that my mind now enjoys a more abundant impartiality.

Miwon Kwon just spoke about the book that won her an award here today and explained that she had embarked naïvely on a project that bore enormous fruit. I’ve read parts of her book, and it is indeed an example of what rigorous thinking can actually accomplish. But that initial spark of naïveté is the essential element. Or as Sol LeWitt said in “Sentences on Conceptual Art,” the first step in the process is to make a leap of faith—which in his view qualified Conceptual artists as mystics. That said, all subsequent steps must be rigorous and methodical. Making art that way is not highly touted these days, and that idea of thinking critically about art in such a manner is even less so. So let me end with a brief story, and I hope I don’t embarrass the colleague about whom I am speaking in doing so.

When I was at the Institute of Fine Arts, I was befriended by Günter Kopcke. Günter is an old-school German scholar of classical antiquity, a remarkably erudite man who in addition to his study of ancient art has maintained an active interest in contemporary art. It was he who gave me copies of the catalogues from the second installment of Documenta in 1959. I had been looking for them and said so. He had them but didn’t want them anymore and just handed over these treasures to me. It was an act of great kindness and solidarity. Günter and I had lunch from time to time, and once he seemed pretty depressed. He was very worried about the institute, he was worried about the field, but mostly he was worried about his research. I could feel a kind of melancholy taking hold of him, and I was really quite concerned about what might happen, what he would do.

Later he came to me greatly cheered up and said that he had concluded that he was stuck. He didn’t have a hypothesis or project to pursue. He didn’t know what he wanted. So he decided to use his upcoming sabbatical to go to the Acropolis and walk the grounds from end to end until an idea came to him. His plan was no plan, beyond returning to the place he had studied for most of his life, a place about which he knew the literature better probably than anybody, in all the languages that he speaks. As learned, as thoughtful, and as subtle a man as he could possibly be, he elected to go unburdened by any expectations in order to reconnect with something that had once inspired him. It was a Baudelairean wager. And it worked.

I think basically we’re at that stage again in a lot of fields. I think a lot of us need to just set down all the things we think we know, all the books we have read, all the pictures we have seen, all of the formulations we have perfected for lectures—by the way, I’m getting out of the lecture business, so you’ll be glad to know this is one of my last—and begin to think afresh based on renewed primary experience of something that exists in the world. As in Ellsworth Kelly’s case, it can be a matter of noticing something while looking out the window of a speeding car and wanting to make something from that glimpse—for example the shadow of a half-open garage that you drive by. Or it can be a particular detail of architecture that you realize nobody has ever talked about before. Or it can be an unusual turn of phrase—one of the marvelous things about Leo Steinberg was that he was as avid a reader as he was a looker, and he hoarded a wealth of verbal gems with a space set aside in his mind for the most memorable nonsense. He read like a writer, not just in order to grasp an argument or find fault with one, but for the inflections of language, for the pleasures of the text.

In summation, all I would simply say is that for a lot of us the work ahead really consists, number one, of taking full responsibility for the fact that “we” have made this corner of “our” art world—not somebody else, not some other generation, not some sociological deus ex machina. “We” have made this art world and, number two, it does not work. And we should never lose sight of the fact that some of the people who have made other parts of the art world are our allies rather than our enemies. There are people in museums whom I trust more than people in my own field. There are gallerists whom I trust more than people in my own field. My community is made up of people who have reason to trust one another because they have a common interest in making the most of something to which they have decided to dedicate their lives. Many decided to dedicate their lives to that thing before they had any professional or career aspirations or prospects whatsoever. Many are doing it now—a younger generation—after realizing that they have very limited career prospects and few reasons to expect they’re going to make a living out of it. That is where we are, and we need to look at it squarely. We need to be very severe with those who persist in distracting us ideologically and those who try in vain to persuade us that it’s really up to somebody else to fix it. It’s up to “us.” Thank you.
