[In Review] Literary Studies and ‘The Next Big Thing’

According to a new article in Stanford University News, researchers studying the brain patterns of PhD students have created fMRI images that “suggest that literary reading provides a truly valuable exercise of people’s brains.” The article barely avoids saying that, for scholars of literature, their findings couldn’t have come at a better time: if we are no longer allowed to believe that literature can tell us things worth knowing, then we should be glad if science proves that it’s good for brain-maintenance.

Of course, the study isn’t the first to pair literary reading with neuroscience or even fMRI technology. A New York Times article from 2010 about the “next big thing in English” discusses several others, including a study at the University of Kentucky researching the novel’s “levels of intentionality” in light of evolutionary psychology and the Literary Lab at Stanford University, where Professor Blakey Vermeule is studying evolutionary explanations of free indirect discourse.

So what are the benefits of this kind of research? For some, it’s all about the bottom line. “To Jonathan Gottschall, a scientific approach can rescue literature departments from the malaise that has embraced them over the last decade and a half.” The humanities are dying, and what they need is an injection of The Next Big Thing, something to make their particular kind of inquiry sexy and relevant in an age seemingly dominated by science.

There is something suspect, though, in that very eagerness for “the next big thing,” because a next implies a next implies a next, a continual struggle just to survive. And if the tools of evolutionary biology can teach us anything, it’s that a species truly suited to its environment doesn’t just survive; it thrives.

Marco Roth, asked to respond to the Times article, seems to have this same concern in mind. “Learning about which part of your brain lights up when you come across a passage of free indirect discourse seems less interesting to me than learning what free indirect discourse is, how and when it emerged, and why a novelist might choose to use it, as a free and conscious choice. Teaching and learning such things may not help you find a mate or even get tenure, but they’re still as much a part of what we know and how we know as our neurotransmitters, even if cash-strapped universities seem determined to forget about them.”

Scientific studies of literature often do seem to have this trait in common, that they only tell us what we already know; for readers of literature the suggestion that fMRI scans reveal “how the right patterns of ink on a page can create vivid mental imagery and instill powerful emotions” seems tautological. However, this is not to disparage science, or its approaches to literature. After four decades during which literary studies routinely conferred on itself an authority to pronounce on all fields—history, political theory, sociology, epistemology—it now is asked to, and sometimes is happy to, abject itself before natural science. And this reversal is not an irony, but a logical development. A rigorous and rational literary criticism, alert to its disciplinary limits and authoritative within them, is the only intellectual offering that scientists, or anyone else, will be interested in from literary scholars—or should be.

[In Review] Who Killed the Liberal Arts?

So wonders Joseph Epstein in a recent article published by The Weekly Standard. With Andrew Delbanco’s College: What It Was, Is, and Should Be as his guide, Epstein meanders through clues in the current state of higher education as well as autobiographical accounts of his undergraduate experience at the University of Chicago and his 30 years as a lecturer at Northwestern University.

But Epstein has no aspirations to Sherlock’s genius. The culprit, as he somewhat predictably reveals it, is a faculty that “themselves no longer believe in” the liberal arts and their “soul-saving” powers. He would even seem to include himself among those disenchanted: “For many years the liberal arts were my second religion. I worshipped their content, I believed in their significance, I fought for them against the philistines of our age…As currently practiced, however, it is becoming more and more difficult to defend the liberal arts…and defending them in the condition in which they linger on scarcely seems worth the struggle.”

But this isn’t the whole story for Epstein, as we come to find 28 paragraphs later when he finally makes it clear that what he meant by “teachers” is actually other teachers, “the guys in the next room” who “in their hunger for relevance and their penchant for self-indulgence” teach “deconstruction, academic feminism, historicism, Marxism, early queer theory, and other, in Wallace Stevens’s phrase, one-idea lunacies.” For Epstein, these areas of inquiry constitute a break from “a general consensus…about what [was] qualified to be taught to the young in the brief span of their education.” “What gets taught today,” instead, he remarks, “is more and more that which interests professors.”

In the end, Epstein’s view is cynical to the point of futility. By the time he begins writing, the liberal arts are already dead. And as it turns out, the death was no murder mystery, but just another “decisive battle” in the culture wars, which, Epstein strongly believes “we lost.” But who gives up this easily on his “second religion”? Epstein may be correct about the lack of faith in the liberal arts by those who teach them, but this makes everyone—himself included—complicit in their decline. To declare the liberal arts a battlefield of the culture war is already to surrender, to give in to the idea that the conclusions we draw from them are ideological strategies. The liberal arts are not dead, and we need to stop behaving as if they were. Instead we ought to submit ourselves to their study and state our belief in them straight out; that way we might have liberal arts whose relevance we won’t have to fight for.

[In Review] For Michael Sandel, Money Does More than Talk

Michael Sandel makes a connection between “market triumphalism” and “moral vacancy,” in an interview about his new book, What Money Can’t Buy. According to Sandel, economics has cast itself as a value-neutral science, and in trying to maximize value without judging values, we have let economics decide the value of too many things — our bodies, human dignity, teaching and learning. Market-determined evaluation inhibits human flourishing.

Sandel has a point: value-neutral assessment is impossible in the human sciences, since value is what assessment assesses. But this tussle over precedence — the political philosopher telling economists that they need more political philosophy — seems a curious way of showing it. It invites unwelcome counter-punching (an economist might suggest that Sandel could use a clearer grasp of economics); and it invites piling on (Molecular and Cell Biology might urge that it has more to say about “our bodies” than the political philosopher; the Education School might challenge the Government Department professor’s turf-claim on “teaching and learning”). More than these, it misses the chance to explore with economists how their disciplinary rationality implies, includes, and can acknowledge the range of values humans hold. Explorations like that can be a slow business, and perhaps do not bring Harvard video podcasts. But they serve the common dignity of thinking, and have a chance to serve the common good.

[In Review] The n+1 Editors Tell You to Burn Your Degree

The editors of n+1 come out against the “credentialism” of the American intellectual and professional elite, equating investors at Goldman Sachs with members of the American Medical Association, and claiming that both—as representatives of their respective wider classes—are deserving of “populist hostility.” However, their concerns about the very real problems facing liberal education in its current form are subverted by the hyperbole of their solution.

These critics pose hard questions that (let’s be frank) have a large measure of justice: What is education good for? Do academics debate anything that matters? Do graduates leave universities prepared for employment and life in an increasingly dynamic world? And they have a hard solution: they wonder whether “a master’s degree…burns brighter than a draft card” and imply that by taking an extreme stance against the University, they will be better able to change a chronically stratified society. But change it into what? The trenchancy and subtlety of their own analysis suggests that populist instinct alone does not produce analytic tools. Just nudge their passionate thinking a few degrees closer to plumb and it would suggest how we might take higher education more seriously than it often takes itself.

[In Review] The Hand that Inflicted the Wound Can Cure the Disease

Debates about the values of liberal education did not begin with our financial crisis. The new university system of the late 19th century forced the old college system to defend its mission. “Of what merit was general education amid a pulsating scientific-industrial civilization that increasingly prized the values of professionalism and narrow expertise?” writes Richard Wolin in a review of Andrew Delbanco’s College: What It Was, Is, and Should Be.

According to Wolin, Delbanco’s point in his chronicle of the American university is “that by subjecting the ends of higher education to a series of extraneous criteria derived from the marketplace, we risk distorting the very purpose and meaning of the college experience.” We all can agree that liberal education is good; the challenge is to decide what it is and what it’s for. “The end result” of not deciding “has been the confused intellectual smorgasbord that defines undergraduate study today.”

Wolin concludes by wondering whether some old assumptions about the Western intellectual tradition, now sidelined, have been unjustly sidelined: “As the Frankfurt School philosopher Theodor Adorno observed about debates concerning the legacy of Western reason: only the hand that inflicted the wound can cure the disease.”

[In Review] Do the UCs “Indoctrinate” Students? (Does it Matter?)

The highly politicized conversation regarding Berkeley’s “indoctrination” of students grows odd when it lectures about things outside the realm of the political.

A recent Wall Street Journal opinion piece claimed that “the politicization of higher education” at the University of California “deprives students of the opportunity to acquire knowledge and refine their minds.” But the piece itself can see education only through a political lens. (It worries how many Democrats and Republicans are housed in the UC Berkeley English and History departments.)

Two Berkeley professors responded indignantly to the false claim that Berkeley does not teach American History or Western Civilization courses (linked here and here)—but both, in frameworks themselves politicized, leave the practical and intellectual issue largely unaddressed. By what means do students receive the skills to weigh competing discourses? What larger intellectual structures order the knowledge that students receive—including the politicized, historical anecdotes that both professors cite?

One of the responses rightly asserts that a university education is more than an inventory of courses—and that “what matters is what we teach students to do: in our case, read, think, write, and decide for themselves.” But on what grounds do students make those decisions? Yes, education is more than an inventory of courses—but is the question of content frivolous? Is any topic as good as another? Telling students we should “decide for ourselves” intellectually is no more than an empty compliment if we’re told next that it doesn’t matter what we decide.