Because Survival is Insufficient

Note: since drafting this I’ve come across an interview with Emily St. John Mandel in which she calls the line that gives me my title “almost the thesis statement of the entire novel.” Her interview resonates in many lovely ways with what I write below; if I revise this I’ll weave some of her comments into my ruminations.

Rereading Emily St. John Mandel’s novel, Station Eleven, for a class this semester, I am struck by the motto of the novel’s Traveling Symphony: “Because survival is insufficient.” This motto—and the symphony itself—are reasons I believe this book’s post-apocalypse more than most—and I have read, watched, and played many post-apocalyptic narratives (I wrote my dissertation on nineteenth-century American apocalyptic thinking, and the larger genre remains one I cannot resist). After the end, art would survive. Some decency would survive. I don’t think that’s naive. Human history has often been brutal and yet, in every era: some art, some decency.

More immediately, however, the ringing truth of this motto speaks to why I’m continually horrified by calls to shut down arts and humanities departments, cut music and arts programs in primary education, or defund institutions like the NEA or the NEH. These moves are often cloaked in calls for “fiscal responsibility,” though the budgets for such programs pale in comparison to just about anything else: fractions of fractions, whose elimination barely moves the needle. When politicians argue for such cuts, they usually aren’t, actually, doing so for fiscal reasons. I don’t distrust the motives of local school boards in the same way, but when the arts and humanities are the first to be cut, it says a great deal about our collective psyche, about how we weigh mere survival against thriving.

What I see in such calls are impoverished notions of human lives and human flourishing. I see claims that survival alone is sufficient. That education can only be vocational training. That the highest good is keeping the current economy tick, tick, ticking away. That we cannot bring ourselves to imagine better, or fairer, or more beautiful.

As both a professor and a parent, I’m keenly aware of a paradox in the way parents think about these things. When our children are children, we push them towards the arts—music lessons, drama camp, creative writing classes—and revel in their burgeoning creativity, beaming through cacophonous recitals and school plays, lavishing praise on varicolored paintings and lopsided sculptures. We brag about how much they read, and how young they started. We’re all a little jealous of that parent whose kid is more excited by Perseus than by Pokémon. When our children are children we grasp, almost intuitively, that imaginative and intellectual engagements are not simply nice embellishments, but central to them becoming fully alive in the world. We know that the screech of their violin signals new-firing synapses, that it will pay dividends we may not be able to fully account for.

But if a kid is still excited about Perseus in college, then we start to get nervous: shouldn’t she be putting away childish things? If he still spends all day reading, we begin to suggest that maybe some coding would balance things out. We push them to track, to professionalize, to be practical. At this point, defenders of the arts and humanities might feel inclined to mention someone like Mark Cuban claiming liberal arts graduates are the future, to make an argument that broad education is actually the best training for an uncertain but changing marketplace. I think Cuban is right in this instance, and he echoes a long chain of business and technology leaders making similar claims.

Maybe such arguments convince a few nervous students or parents. I suspect they don’t convince many parents, though, who will remain firmly convinced—contrary to much evidence—that a computer science or business degree is a sure ticket to economic security. And there’s nothing wrong with, say, coding. I do a lot of it myself, and even teach it to English students. But the reason I teach coding isn’t to help students secure tech jobs. I do it to introduce new ways of thinking about problems, and ultimately to give them new avenues for creativity. The fact is that there is no sure educational track toward economic security, but more importantly, there is no sure link between economic security and happiness, or fulfillment. I’m not here arguing that students would be happier and more fulfilled as English teachers than as investment bankers; I’m arguing that both English teachers and investment bankers might be happier and more fulfilled in a society suffused by history, literature, music, and art. I’m arguing that the novelties of such a world would spark both innovation and joy, in ways we cannot predict a priori.

There’s a foundational defense of the arts and humanities that we ignore when we concede economics as the only premise to the argument. Discovery is intrinsic to human flourishing. This is to my mind the best reason to defend—and to fund—the arts, the humanities, or, for that matter, space exploration. We are never content with “that’s just how it was,” “that’s just how it is,” or “that’s just how it will be.” We are a curious species. We want to see more, to learn more, to understand more, and—yes—to make use of more. That latter clause signals how this human drive plays out in both good and bad ways, but we cannot ignore it.

In the sciences, “pure research” never guarantees immediate, tangible economic or societal benefits: it is by definition exploratory and incremental. Nevertheless, pure research very often results in the most unexpected and transformative discoveries, those that a more presentist, pragmatic approach never would have considered. To construct a scientific research paradigm around only those studies that seem immediately, economically useful would be to commit ourselves to living only in our present. Settling on a science without risk, we would forfeit our future. When a politician derides supposedly abstruse or useless scientific studies to justify budget cuts, we should not hear an indictment of the scientists they denigrate. Instead, we should hear that politician delineating the narrow boundaries of his imagination.

The final passage of Station Eleven makes the link between these species of imagination clear. One character, Clark, has taken up the role of curator of “the Museum of Civilization,” and attempts to convey the history of a lost civilization to children who never knew it. The novel ends with Clark’s ruminations:

He has no expectation of seeing an airplane rise again in his lifetime, but is it possible that somewhere there are ships setting out? If there are again towns with streetlights, if there are symphonies and newspapers, then what else might this awakening world contain? Perhaps vessels are setting out even now, traveling toward or away from him, steered by sailors armed with maps and knowledge of the stars, driven by need or perhaps simply curiosity: whatever became of the countries on the other side? If nothing else, it’s pleasant to consider the possibility. He likes the thought of ships moving over the water, toward another world just out of sight.

Here “symphonies and newspapers” become catalysts for imagining a more humane world of discovery, knowledge, and curiosity. By their mere existence, they allow Clark to think more capaciously than he otherwise could. Re-discovering the past makes possible a different and better future.

Our collective future should not be foreshortened to the bounds of a few politicians’ stunted imaginations. The arts and humanities enlarge the imagination: they help us consider other people, other cultures, and other possibilities for ourselves. We think with and through our histories, with and through our stories. The arts and humanities enrich our culture, not always or only our coffers, and this is a fundamental good worth defending. They do this for those who make their careers working in the arts, but they do this too for people who make their careers in science and technology. They do this for people who believe their lives should comprise more than work. They do this for people who want the work they do to be meaningful and to persist beyond their short lives. They do this for people who want to live, not merely survive. Because survival is insufficient.

Links from Graduate Fellowships Workshop

On October 2 I led an informal workshop for Northeastern English Graduate students about the process of applying for fellowships to support their research and/or teaching. These are some links I thought they might find useful in the fellowship discovery and application process:

In addition, I’m happy to provide my application materials for the Andrew W. Mellon Fellowship of Scholars in Critical Bibliography at the Rare Book School.

Mea Culpa: on Conference Tweeting, Politeness, and Community Building

Kathleen Fitzpatrick’s post “If You Can’t Say Anything Nice,” about public shaming on Twitter, came at a timely moment for me. Describing the culture of Twitter commentary, she writes:

You get irritated by something — something someone said or didn’t say, something that doesn’t work the way you want it to — you toss off a quick complaint, and you link to the offender so that they see it. You’re in a hurry, you’ve only got so much space, and (if you’re being honest with yourself) you’re hoping that your followers will agree with your complaint, or find it funny, or that it will otherwise catch their attention enough to be RT’d.

I’ve done this, probably more times than I want to admit, without even thinking about it. But I’ve also been on the receiving end of this kind of public insult a few times, and I’m here to tell you, it sucks.

I read this post while at a conference, and as I read it realized that I’d been guilty of just this kind of ungenerous commentary earlier in the day. I’d disagreed strongly with one of the presenters and written a series of critiques on Twitter, which many in my community found pithy and retweeted. Let me say: I absolutely believed in what I wrote, and I don’t retract the ideas. But in the Twitter exchanges around those posts, some of the conversation got more personal. The presenter—a fellow academic and human being named Elaine Treharne, not some nameless person—read those exchanges after the panel and was deeply hurt by them. She was right. I was wrong. I tweeted an apology, but the entire affair, coupled with Kathleen’s post, kept working on me. I ended up chatting with Elaine for several hours yesterday evening about electronic fora, professionalism, and valid critique through channels such as Twitter. I think we both learned quite a bit; I know I learned quite a bit. We still don’t entirely agree on the substantive points from her presentation, but I hope we’re now friends as well as colleagues. She agreed to let me use her name in this post.

After yesterday’s experiences and conversations, I spent the evening considering my tweets over the past several conferences I’ve attended, including at the much-ballyhooed “Dark Side of DH” panel at MLA in Boston. Kathleen is absolutely right: our field needs to seriously consider both how our current Twitter culture developed, and how it might need to change moving forward. I need to seriously consider how I engage with colleagues on Twitter; I am not blameless and I need to reform. This post is my attempt to start thinking through both how the current Twitter culture came to be and how we might change it. The post owes any of its insights to Elaine’s generous willingness to talk seriously with me about these issues after being flamed by my community on Twitter.

Only a few years ago, DH was still a fringe field, mostly ignored by academia more widely. DHers felt not like “the next big thing,” but like an embattled minority. The community was very small, and the worry at conferences was about how to convince our colleagues that what we did was valuable. How can we get hired? How can we get promoted? How can we persuade the field to pay attention to this work we find remarkable? DHers were overrepresented in online fora such as Twitter, though, which became a place to build support communities for DH scholars who felt isolated on their campuses and within the wider academic community.

Within that framework, the back-chatter on Twitter was a valuable support mechanism. I remember sitting in a conference panel in my disciplinary field—nineteenth-century American literature—a few years ago when an eminent professor described the utter vapidity of modern reading practices (uncharitably: “kids these days with their screens! and their ADD!”) compared to those of 150 years ago. Around the room, heads were nodding vigorously, and in the Q&A many other prominent members of my field rose to concur.

In that room, I felt like the oddball. My intellectual interests were being dismissed out of hand by the very people likely to decide whether my work would be published (and thus, whether I would get a job, get tenure, &c., &c.). I disagreed with them vehemently, but as a junior scholar was hesitant to challenge the rising consensus in the room, for fear that would further isolate me. And so I turned to Twitter to remind myself that I did have a community who would welcome my ideas on these issues. I tweeted my frustrations—I conferred with my dispersed but friendly DH community—and found support and engagement. Perhaps this doesn’t excuse public snarkiness, but that snark was a way of building community—certifying the value of unpopular interests and opinions. None of the eminent panelists from that session read those conversations, nor would they have. Nobody got hurt, and I felt less embattled and more prepared to go on with my work.

But that was several years ago, when I had far fewer followers on Twitter, and when DH was not at the center of the academy’s attention. Today many more academics, including those not heavily involved in DH, are on Twitter. And rather than being a nearly ignored fringe element of the academy, prominent DHers are being looked to as gatekeepers into a much-desired field. Panelists know to investigate how their sessions were tweeted, and they care what was said about them online. What’s more, many of our colleagues now know how to find tweets about them even when those tweets don’t include their names or usernames. We cannot assume that anonymous tweeting will do no harm to the colleagues we discuss. Tweets are not semi-private, whispered conversations in the back of the conference room; our tweets are very public and could unfairly shape public perception of the colleagues we discuss in them.

Within this framework, the same kind of Twitter chatter that helped build DH communities only a few years ago can resonate with newcomers to the field precisely as that vigorous denunciation of “technology” resonated for me as a young nineteenth-century Americanist. In other words, Twitter chatter can easily read not as community building, but as insider dismissal and exclusion. Such exchanges belie claims that DH is an open field, instead alienating scholars attempting to engage with it. We are no longer the upstarts; we are increasingly seen as the establishment. While this perception doesn’t exactly line up with reality, it certainly shapes the way our Twitter conversations—and in turn the wider DH field—are perceived by newcomers to it. In Elaine’s case, she felt she was being dismissed out of hand by scholars whose work she knows and respects; we had convinced her that she didn’t belong in DH. This is a terrible outcome our field should be wary of replicating.

Nevertheless, I remain firmly convinced that Twitter conversations can supplement and enrich academic conferences, providing a record of their proceedings, allowing scholars to engage actively with their presenting colleagues, and providing access to conferences for those scholars who cannot attend. But as a community, we need to think hard about how to retain the value of conference tweeting while mitigating its alienating effects on our colleagues. This does not mean, I think, refraining from any critique on Twitter, but it will mean remembering when crafting those critiques that there are real people on the receiving end.

Principles of Conference Tweeting

Going forward, I’m going to try to tweet conference panels following these principles.

  1. I will post praise generously, sharing what I find interesting about presentations.
  2. Likewise, I will share pertinent links to people and projects, in order to bring attention to my colleagues’ work.
  3. When posting questions or critiques, I will include the panelist’s username (an @ mention) whenever possible.
  4. If the panelist does not have a username—or if I cannot find it—I will do my best to alert them when I post questions or critiques, rather than leaving them to discover those engagements independently.
  5. I will not post questions to Twitter that I would not ask in the panel Q&A.
  6. I will not use a tone on Twitter that I would not use when speaking to the scholar in person.
  7. I will avoid “crosstalk”—joking exchanges only tangentially related to the talk—unless the presenter is explicitly involved in the chatter.
  8. I will refuse to post or engage with posts that comment on the presenter’s person, rather than the presenter’s ideas.

I am not calling for an embargo on conference tweeting, or for engagements exclusively devoted to agreement or confirmation. To turn conference tweeting into a tepid, timid echo chamber would not serve DH or the wider academy. But as the DH field grows and newcomers attempt to engage with it, we must consider the effect our chatter might have on them. I don’t want to make newcomers to DH feel as isolated as I felt in that room of eminent Americanists. Changing my public presentation on Twitter seems a small concession—worth making—if it will prevent that happening.

Thanks to Flickr users digitalART2, exquisitur, and brx0 for the Creative-Commons photos embedded here.

Creating and Maintaining a Professional Presence Online: A Workshop for Graduate Students

Today I’ll be running a workshop for students in Northeastern University’s English Graduate Program on “Creating and Maintaining a Professional Presence Online.” This is an essential topic for scholars entering the field today, but it’s rarely addressed in any formal way by departments. The decision to take one’s scholarship online (or not to) has real consequences on the job market and beyond.

As I did before our job market session a few weeks ago, I turned to colleagues online for help finding useful articles or blog posts on the subject. Here are the links I’ll be passing on today:

  1. If you read only one post, I would recommend Jentery Sayers’s “Do You Need Your Own Website While On the Job Market?” at ProfHacker. It’s a thorough piece that discusses the pros and cons of maintaining a professional website, while also providing some guidance about how to get started.
  2. Phil Agre’s decade-old “Networking on the Network” remains well worth a read—indeed, the points he makes about email are only amplified by the growth of blogs, Facebook, and Twitter in academia.

Useful Resources for the Academic Job Market

Later today I’ll join a workshop for graduate students in Northeastern University’s English Graduate Program who are making (or considering) a run on the job market. As a recent survivor of the market I hope I can offer some insight into its quirks and vicissitudes. To that end—and with the help of several colleagues on Twitter—I’ve compiled a list of useful articles for students embarking on the academic job search.

  • Brian Croxall’s “Preparing Now for Next Year’s Job Market” is pitched as a help for students preparing over the summer. Even with summer now gone, however, the post provides a useful summary of most materials students will need for academic job applications.
  • Thanks to Travis Foster for pointing my attention to William H. Wandless’ practical, detailed, and insightful posts about the job application process: “The Academic Job Market: English Search Advice” Part I, Part II, and Part III.