Slack Chat: Conferences


Chris Bouton: For this week’s slack chat, we’re going to continue our discussion of academic conferences. On Monday, I wrote about the problems of being an introvert at academic conferences. On Tuesday, Erin wrote about the invisible (and unpaid) work that goes into conference papers. Yesterday, David wrote in defense of the academic conference.

Let’s get ready for all the hot conference takes.

David Mislin: Conferences are too expensive. That’s my hot take.

Chris: They present barriers that make attendance difficult for many people, especially non-tenure-track faculty.

Hotels in major cities are expensive, flights are expensive, conference fees are expensive.

And then no one comes to your panel

David: Right. But I also think about this in the context of tenure-track and tenured folks at cash-strapped universities. Even if faculty aren’t paying out of pocket, the money has to come from somewhere.

The amount it costs to send one faculty member to the AHA would pay a good part of a grad student stipend for a semester.

Chris: And it ties into what Erin wrote about this week. The invisible work that goes into attending/putting on conferences. We’re supposed to organize panels and commentators, prepare papers, and rehearse them all without any compensation for the labor involved.

Why? I think the answer is in what you wrote yesterday: it’s custom and tradition

David: Right. I think at one time, in the pre-internet era (when travel was also more difficult) conferences served a real purpose. And I do think they still can, but not in the form they often take by default.

Chris: Yeah, I mean a Skype call or Google hangout is a lot cheaper and could accomplish largely the same purpose.

This uncompensated labor would be somewhat understandable or justifiable if everyone at these conferences were receiving competitive salaries, but they’re not. Instead we ask scholars to put in all this work and forgo compensation, in the hope that one day they might be awarded a stable job.

That’s a situation ripe for exploitation

David: I guess that raises a question. Assuming that the major part of conferences is the presentation of work, what is the point? Ideally, it seems that the point is to get feedback in order to develop one’s project. But rarely does one get good feedback.

Chris: Ideally that is the point, but, as we know, the reality is much different.

How often does the Q&A get dominated by questions for the least-prepared presenter, or by a filibustering statement about the questioner’s own work dressed up as a question?

David: I’ve seen the latter a lot.

The former can go either way. I’ve also seen the least-prepared presenter just be ignored

Chris: I’ve seen them get lots of questions, in a low-hanging-fruit sort of way: it’s easier to come up with a question for the person who couldn’t be bothered to come up with a coherent thesis.

David: Yeah, that’s true. It speaks to a larger problem of people not really critiquing other people’s work on its merits but rather in terms of how it compares to their own projects.

Chris: To follow up on your question of what the point of an academic conference is, if it isn’t to present work and get useful feedback, then what is it?

David: Professional networking?

Seeing friends?

I mean, for me at this point the best part of conferences is seeing all my friends from grad school who are scattered across the country.

Chris: They also serve as a marker, a box to check on the way up the academic ladder, a line in the CV.

David: Yeah, although at some point one wonders just how many conferences you need

Chris: And are they the “right” conferences?

I’ve been thinking a lot about prestige and markers in academia recently, and conferences serve that function. They indicate that if you’ve presented at AHA, OAH, or the other big organizational ones (for me that would be SHEAR or SHA), then you’ve got the right pedigree for potentially joining the academic elite.

They signal that you’re a “serious scholar”

And they work in conjunction with things like publications in the “right” journals, going to a highly ranked program, and having an academic interest in the “right” or “hot” fields

I’m really overdoing it with the quotation marks, but I do it to highlight the nebulous nature of these words in the context of higher ed.

David: I think that’s right. But conference committees themselves are quite mercurial.

Like much of everything else.

Chris: Yeah, all the major stepping stones in academia are surrounded by murky traditions and an unwillingness to be open about the process

Conference committees, dissertation committees, exam committees etc.

David: So let me shift the focus here, if I might. I suggested in my piece yesterday that conferences do still serve some function if we rethink them. I’m curious: do you accept that premise?

Is there still value to conferences in 2017? And, if so, what’s needed to realize that value?

Chris: If we’re willing to be honest about their function, then yes, they have a purpose.

David: For you, what’s the primary function? Just CV-building?

Chris: I think their primary function is CV-building, along with an institutional devotion to them because they’ve existed forever.

They also serve a social function

I’d like to rethink the standard format of 3-4 papers plus a commentator and a chair.

David: Yeah. I agree on that point. And even professional organizations seem to have reached that conclusion.

But not everyone seems to follow the guidance.

Chris: I’d prefer a situation where the papers were pre-circulated and, instead of people talking for an hour and a half followed by 20-30 minutes of discussion, the whole session was discussion.

It could be a guided discussion led by a commentator

David: Yeah, I agree. Pre-circulated papers really enhance a conference.

That was also the main idea behind my suggestion that conferences be smaller, to foster a sense of community among the attendees in the spirit of good discussion.

Chris: And I think my desire dovetails well with your idea for smaller conferences.

There’s more to be gained from discussion than just sitting there passively.

Research on learning and education tells us that passive learning is less effective and that endless lecturing doesn’t encourage the higher-order thinking skills we want to develop. So why is that approach okay at conferences?

Doesn’t almost everyone, when asked about their best experiences in grad school, say it was the seminars? Where you tear into a book or two over a three-hour period?

Doesn’t that suggest that we should take a similar approach to the times when we gather together?

David: Exactly. I agree.

Chris: I have a hard time sitting through an entire panel. I start to squirm and get bored and if that’s happening to me, then how many others are in a similar position?

David: Oh I know. I’m terrible about attending them.

Two panels per day tends to be my maximum.


Chris: Right and what does that say about the structure and nature of conferences? That the people for whom they are designed can’t stand them.

And most conferences have at least four panel time slots per day

And then roundtables or plenary sessions at night

David: Well, it’s significant that, as you said earlier, most panels are empty.

I’ve had as few as three people in the audience.

Chris: That reality lays bare the absurdity of the ideology justifying conferences

If you’re there to get feedback on your work, how’s that possible when there are only three people in the audience?

And yours is not a unique experience

David: And to me that is the absurdity. No one seems to really be serious about getting feedback.

Chris: To go back to what we were discussing before, because mostly they view conferences as a chance to socialize.

And I’m not saying that’s a bad thing, socializing is good.

But let’s stop pretending otherwise.

David: Which is great. I love socializing. But conferences come with a steep price tag for social time.

Couldn’t we all just get a VRBO?

Chris: Exactly.

And you wouldn’t have conference organizers begging you not to use technology because the hotel’s equipment is too expensive to rent

Lucky for me, I never had that problem. I never did PowerPoints or anything else.

It’s kind of a problem when a profession that already struggles to embrace technology begs people not to use it.

David: That is another absurdity. So much of our work now involves visuals.

Chris: I guess I’ll ask the question that has lingered over this entire conversation: will the structure of academic conferences actually change?

We’ve laid out what we think the problems are and some possible solutions, but will they?

David: I think so, gradually. Because I think people won’t continue to pay money to attend them. Back to my point that started this: even TT/tenured faculty won’t be able to get conference travel funded anymore.

Chris: I’d agree with that. I think any change would be gradual.

To your point, when TT faculty can’t get funding, that’s when conferences will be in real trouble.

David: Yeah. It’ll be a slow process. Like most things in this profession.

Chris: Tradition and the status quo are powerful anchors.

Even in a profession where we value our critical thinking skills and think of ourselves as a meritocracy, we’re often blind to our own obvious faults.


In Praise of the Academic Conference


This month I’m attending two academic conferences in the span of two weeks. That wouldn’t be particularly remarkable, save for the fact that it marks the end of a nearly two-year long self-imposed moratorium on conference attendance.

After the 2016 meeting of the American Historical Association, I’d had enough of conferences. I’d reached the point of doubting whether they served any professional purpose for me. Academic conferences are expensive: registration typically is around $100. Travel to the host city is likely at least as much. And hotels are often in the ballpark of $200 per night. (I actually really like staying in hotels so spending money on them is less objectionable, but I’d rather spend that money on a vacation). Attending the AHA that year set me back close to $1000 (and that was with my department covering my registration, a luxury many scholars do not enjoy).

Beyond the expense, though, I found the very purpose of conferences to be increasingly elusive. Despite efforts by the AHA and other professional organizations to move away from them, the bread-and-butter element of such gatherings is the traditional panel. These are between 90 minutes and two hours long. Three or four scholars present papers on work in progress, another scholar might offer commentary, and then the audience asks questions (in theory anyway; often audience members just make speeches about their own work, thinly disguised as questions).


Live from the AHA….

While I’ve heard many intriguing papers, I’ve heard just as many that were under-prepared and under-rehearsed, and which bore little connection to any of the others on the panel. In grad school I was counseled not to worry about my paper. “No one cares what you say,” a senior academic once told me. “It just matters that you present something so you can put it on your CV.” That attitude seems to be one that many folks have taken to heart.

Still, though, despite my feeling that academic conferences are a relic of the early twentieth century, I’ve recently found a new appreciation of them. Done well, they offer opportunities for things that are increasingly rare in academic life:

Non-Social-Media Social Time: I am, I freely admit, a Twitter addict. I tweet constantly. I compulsively check it in the middle of the night. One of the things I most appreciate about Twitter (and, to a lesser extent, Facebook) is the way it’s allowed me to connect to countless colleagues I’ve never met in person (hello, editors of this blog!).

Conferences provide an ideal venue for making in-person connections, and for finally putting a face to the names I encounter on Twitter, on blogs, and in academic journals. Much as we might forget it, there’s value to personal connections. I’ve had far deeper discussions of history and gained many more insights for my scholarship through in-person conversations than through social media.

Sharing In-Progress Work: With the proliferation of academic blogs and online journals, and with the ever-increasing need to produce new content in support of one’s academic brand, historians are called on more and more to produce polished work. Conferences represent a venue where the work need not be polished. Presentations are understood to be works-in-progress (though the growing popularity of live-tweeting academic gatherings has changed this somewhat). This, in turn, allows presenters to ask questions, test arguments, and, ideally, to be less on their guard about their work.

The question, then, is this: what can be done to sustain the benefits of conferences while eliminating the components that seem out-of-date in 2017? I have many thoughts on this, but here are just a few:

Make them Affordable: This one should be obvious. As contingent faculty make up an ever-growing percentage of the academic profession, conference organizers can no longer assume that attendees will have fees and travel expenses covered by their institutions. Moving conferences away from convention centers and downtown hotels would lower costs, making attendance easier.

Make them Smaller: Many of the best features of conferences would be enhanced by keeping them small. Workshop-type gatherings, where everyone attends the same panels and hears all the presentations (as opposed to massive conferences that schedule dozens of panels at the same time) not only allow professional connections to develop but also allow sustained academic conversation. (Smaller, more frequent, regional conferences would also have the benefit of being less expensive.)

Don’t Allow Job Interviews: For decades, academic conferences have served as venues for first-round job interviews. In the pre-internet era this made sense: they brought together both job-seekers and academic institutions. But now, first-round interviews can be done just as easily with Skype or Google Hangouts. The presence of job interviews adds a certain toxicity to conferences. As someone once remarked to me at an American Historical Association meeting, “you can smell the stress and unhappiness in the air.” Especially in this competitive job market, the environment fostered by the presence of interviews is the antithesis of that which makes for a productive conference.

Far from having outlived their use, academic conferences are perhaps more useful than ever in 2017. But like so much else in academia, they’ve run on tradition and custom for too long.

When you haven’t finished your conference paper…


This week, we’re all talking about academic conferences. I am presenting at one next week. I have not finished the paper for that conference, in part because I was finishing up another article with a colleague that was published yesterday. This is the sort of cluster of deadlines that academics try to avoid so much, but when you’re dealing with outside organizations and news outlets, there’s only so much you can do. In lieu of a fuller post on conferences, I give you a snippet of and link to the article I have been writing.

The only other point I’d make is that much of what academics do is “invisible,” not just because it happens at conferences that are of little interest to the general public, but also because it happens in private, and because it is unpaid. None of this work – the article in the Washington Post, my writing for this blog, the conference paper I’m writing, the book I’m writing – is paid, and much of it depends on research I have to pay for myself. It is still part of my job and I’m expected to do it to get and maintain employment.

In July, an international group of 62 Catholics delivered a challenge to Pope Francis, charging him with propagating heresies in his 2016 suggestion that divorced and remarried Catholics could receive Communion, a sacrament from which they are currently barred.

After they had received no satisfactory response from the pope, they released their “Filial Correction” to the public in September. Since then, the number of signatories has grown to 235.

The incident — Catholics challenging the pope, even accusing him of heresy — no doubt seems shocking. But challenges to papal authority are nothing new in the Catholic Church. Laypeople, theologians and priests have claimed the right to define the nature of Catholicism throughout its 2,000-year history.

Read more here: A group of Catholics has charged Pope Francis with heresy. Here’s why that matters.


An Introvert’s Experiences at Academic Conferences


Academia and introversion often go hand-in-hand. Introverts prefer to work alone and engage in highly personalized research. They find socializing in large groups draining, and they shun big gatherings full of small talk in favor of smaller, more intimate conversations. The major academic conferences (AHA, OAH, SHEAR, SHA, etc.) were the worst part of academia for me as an introvert.

Graduate conferences, on the other hand, were relatively easy to navigate. They were small; generally everyone fit into a large conference room or filled a quarter of an auditorium. The participants and organizers were all graduate students who were similarly anxious and excited about presenting their research. The commenters were generally faculty from the host institution or a nearby one. Their comments were critical, but kind. The crowds during the breaks were small, maybe a dozen or so people hanging out in the hallway or walking to the next session. There was simply less to take in and potentially be overwhelmed by.


An Introvert’s version of Hell 

My trips to major conferences, however, were often overwhelming. Entering the ballrooms and reception halls for the various receptions was a flashpoint for my anxieties. Like most introverts, I’m highly reactive to new situations, trying to take everything in and assess the scene. At the big receptions, where I knew only a handful of people, I’d latch on to the few people I knew and wouldn’t let go. I had to deliver the same spiel over and over. “I’m from U-Del.” “I study physical confrontations between slaves and whites.” “Yes, I’m enjoying the conference.” “No, I didn’t see that panel.” Having to engage in the same conversation again and again would drain all the energy from me. I’d last as long as I could before the voice in the back of my head demanded that I go back to my room and recharge.

Being in the audience for other panels and sessions was also particularly difficult. I tried to arrive early to stake out a seat in the back away from the crowd, but invariably I’d have to make small talk with a stranger before the panel started. I’m sure the people who struck up conversations could sense my anxiety. Often the rooms would be packed, giving me more things to think about and more people to potentially have awkward interactions with. Afterwards I would often replay the conversations in my head and chide myself for my awkwardness. One time I went to say thank you to a participant on a panel I had organized, but once I realized he was surrounded by a group of friends and I couldn’t immediately get his attention, I just left. I’m sure someone noticed my awkward lurching around, only compounding my self-consciousness.


Despite the stereotypes about introverts being shy or awkward, I had no problems delivering my papers in rooms of varying capacities. It was just like giving a lecture. And I loved lecturing. How could I not? I had an audience interested in what I was saying. What can be more personally affirming than that? I would rehearse my papers again and again. I’d read them aloud in the basement of my apartment. I’d time myself, note areas where I stumbled over my words, and try to limit my use of jargon. I approached every presentation like I was trying to tell a story. On the day of my panel, I would reread my paper incessantly. I’d find a quiet corner of the hotel and scribble down some last-minute changes. Twenty minutes later, I’d change it all back. My presentations always went fine. Thanks to my constant practice I never went over time. The Q&A portions were seldom challenging since I rarely got any questions. Whether that was a testament to the quality of my papers or the lack thereof, I’ll never know.

Eventually, I figured out ways to manage my stress and my introversion. I told myself that it was okay to retreat to my room when I needed to. I allowed myself to skip sessions if nothing particularly interested me or if I just needed a break. I took solace in walks alone and leaving the hotel to eat. I would try to get as far away from the conference as I could and disappear into a restaurant with a book or my iPad. All of this, I believe, was essential for me to succeed at conferences. I will never be holding court at the hotel bar. I’ll never be the panel participant inviting strangers to come to his hotel room to watch the big college football game. And that’s okay.

As I’ve travelled more for my non-academic job, I’ve gotten better at dealing with my introversion away from home. I do my research ahead of time and look for places that I might like to go. For me, that’s restaurants and bookstores. Ultimately, I hope that my experiences remind other academic introverts of their own experiences. Maybe this will help some of them navigate conferences as best they can and remind them that there’s nothing wrong with needing to get away from it all for a while.

Slack chat: how do historians build their courses?


Erin Bartram: Several years ago, when I was a grad student teaching at UConn, my roommate commented on how much work I put into writing lectures. She had done her undergraduate work at UConn, so she wasn’t unfamiliar with college courses, or even this specific university. I asked her what she meant, and she said “I guess I never thought about it before. I thought professors just got up and talked from what was in their head when they lectured.”

The work that goes into conceptualizing, planning, and writing the material for a course is largely invisible to the students in the classroom, let alone the general public, and that invisibility matters.

So today we’re going to chat about how academics, specifically historians, do this work, and how the invisibility of the work itself enables exploitation which leads to bad learning experiences.

Chris Bouton: Your anecdote speaks to a very common attitude among students. They tend to think of lectures as a pretty straightforward endeavor. Step 1. Professor provides information. Step 2. Student writes it down. Step 3. ?????

When in fact regurgitation of information is a basic skill that isn’t particularly useful in and of itself, especially in a history classroom, where we have higher goals than just handing out info and having it given back to us. At their most fundamental, lectures are arguments.

Erin: That’s why it’s important that we don’t think of lecturing itself as regurgitation either

They are themselves sub-arguments of the larger argument of the course, which I think is where we start when planning a course.

Chris: If we’re approaching a course from a macro-perspective, then we start with key themes or arguments. For example, what does freedom mean in American history and how does it change over time? Who defines it, who gets it, and who doesn’t?

Erin: Let’s start a few steps back from there, even – how do professors get to teach the classes they teach?

Chris: We’re going real macro now.

Erin: I mean, I think it matters because it’s important to know that we don’t just say “I’m going to teach this thing that I’m a specialist in this semester” and that’s what happens

Chris: Absolutely.

Erin: All courses that are taught have to be approved and in the course catalog – the listing of all the courses that can be taught at the university. Getting a course approved means a professor has to design it, explain its purpose, its objectives, and what need it fills in the department and the college, think about who will take it and why, and then get it approved by several layers of bureaucracy, both within the department and at higher levels.

Getting a new course approved is a lot of work, but once a course is in the catalog, it can be taught at any time. Even so, what gets taught when is something departments think carefully about (or should).

Chris: Some courses are permanent features of the course catalog, they’re offered every semester and taught by different members of the faculty: American History (generally split into 2 courses with a division occurring somewhere around the Civil War/Reconstruction period), Western Civilization etc.

These are introductory courses that fulfill university distribution requirements. Once you take those, you can generally move on to higher-level courses that are more specialized in focus and taught by specific faculty in their areas of historical specialty.

Erin: Those courses often fill general education requirements, which means lots of students will take them, which means they fill up and justify their existence. There’s nothing inherently useful about them, and many historians say there’s actually little useful about them.

Chris: From a departmental teaching perspective, they’re the least appealing classes so the most accomplished faculty rarely teach them.

Erin: Some universities won’t have prerequisites for upper division history courses, though, which contributes to how faculty teach those courses. That’s the situation I’m in at my position – I can teach a 300 level course on the Civil War, but you don’t have to have taken college-level US history to enroll.

That shaped how I planned the course – what stuff they might have learned in a 100-level course did I need to teach in this course?

Chris: The teaching responsibility then falls on the rest of the faculty. Generally meaning graduate students or adjuncts.

Erin: The reason introductory courses are often considered least appealing? They’re the hardest to teach. They require professors to master huge amounts of content and, from that, make a big argument without actually teaching all the content itself.

Chris: You have to cover several hundred years of history in 14 weeks, teach some kind of historical thinking, and design assignments that take into account the vastly different types of students you’ll find in the class.

If you don’t care about teaching, you don’t have to do these things. You can just lecture and assign two multiple-choice tests and a paper.

We’re talking about the difficulties of making a good class

Erin: Even if you just lecture…that’s still a ton of work

Chris: True, but you only have to do it once and then never change it. I’m just trying to suggest that the good teachers put a lot of work in on top of that.

Erin: Yes. You can just never revise it at all, whether in response to new scholarship or to students not understanding stuff

Thinking about that first step, though, when you’re writing a class for the first time: once you’ve got a sense of the argument you’re making with the class, you have to figure out how to make it. Each topic you cover, lecture you write, and document you assign relates to the others and to the broader course.

You can think of it like inviting people to a wedding. Like, if we invite this person, then we have to invite that family, but if we invite them, what about this other person.

Chris: Right, you’ve started with the broader theme, taking my meaning-of-freedom example. And say the class is the first half of the U.S. survey. Then you have to think about the big moments of that half (colonization, the Revolution, the Civil War, etc.) and how they fit in. Then think about different groups or social trends: the experiences of the poor, women, the enslaved, etc. How do they fit into this theme?

Erin: This also means you don’t always teach everything in chronological order, despite it being a history class.

Chris: Yes, chronological order is flexible, and it also means that you won’t touch every subject. Teaching a survey is about triage: figuring out what is absolutely essential and starting with that.

Erin: For instance, I don’t teach European colonialism in the first half of the US survey in a strictly chronological way. We do Barbados and the Chesapeake and New Netherland and Canada and Mexico separately.

Chris: And by approaching the subject thematically, you’ve set up a basis for comparison.

Erin: I do the same with the 19th century, and I think lots of us do. It means things like the Missouri Compromise might show up “later” than you’d think, but that’s only because of how the material is arranged. It then provides us with the opportunity to do comparison work in discussion, rearranging the material and putting it together in new combinations.

And this is why lectures/lessons/documents have to be so carefully plotted: you’re setting up ideas and connections that will bear further fruit weeks and months later

Chris: You’re constructing something from the ground up, trying to make sure all the pieces fit together.

Erin: And the hardest stuff to write, honestly, is the stuff you’re a specialist in. The idea that I could get up and just wing a 50-minute lecture on the Jacksonian period or women’s history or Catholic history…

Chris: Oh God yes. Teaching World History or Western civ was much easier. Without the insane depth of knowledge in those areas, it’s easier to focus on the broader themes that you’re trying to get across. Okay, I’ve got 50 minutes on the Meiji Restoration, guess I better figure out exactly what I want out of it.

Erin: And when we do this, we’re not just accumulating facts, or drawing on reserves of facts. We’re trying to weigh and synthesize arguments from dozens of books and articles we’ve read. Historians read hundreds of books and then have written/oral examinations on them at the end of their first few years of doctoral study, and most of us start teaching on our own after that. I’m quite sure I’d have done better on my comps if I’d done them after I’d been teaching.

Like, having to synthesize Good Wives, Nasty Wenches, & Anxious Patriarchs for a 100 level class made me understand that book way more. I had to really figure out what I thought about the Revolution – ideological or economic – when I had to teach it.

Chris: Teaching puts these subjects into sharp focus and forces you to take a position on them in a way that comps don’t, even though that’s the point of comps. I don’t think comps are particularly effective at that, but that’s a subject for another day.

Erin: So, there’s one thing that I think is important here. We’ve talked about the increasing reliance on part-time/adjunct labor in academia, and that fact intersects with this conversation in two ways.

Adjuncts are often hired last minute. Like, the weekend before. Even full-time non-tenure track faculty, being at the bottom of the ladder and with fewer protections, can have their course schedule switched at the last minute.

All of the work we’ve talked about here, if you’re going to do it and do it well, takes HOURS and needs considered thought and time.

Chris: In CT, I was hired to teach a US-II survey in late December for a class that began in late January.

Having never taught US-II, I threw myself into course prep, outlining, writing lectures, selecting primary documents etc.

A few days before the semester started I found out that the class was cancelled because it didn’t have enough people in it.

Erin: And that gets us to the second point: whether the class happens or doesn’t happen (ugh), adjuncts don’t get paid for the hours they spend designing and revising classes.

Chris: I didn’t get paid for a single minute of that course prep and that was time I could have spent on my dissertation.

Erin: You can see here the ways that this is exploitative and bad for students.

Either adjuncts do massive amounts of unpaid work (provided they even have the time/opportunity to do so) to make the course good or they don’t and students get a crummy learning experience.

Chris: And even if they put in the good work and make connections with students, there’s no guarantee they’ll be invited back the next semester.

Students naturally gravitate toward certain professors, but that’s impossible when you’re talking about adjuncts, so the students are missing out.

Erin: And if students are interested in your field, and might become majors, they don’t bother because they don’t know that you – and your (theoretical) specialty upper division classes – will even be there.

Chris: History departments complain about declining enrollments while relying on adjuncts and grad students to handle more and more of the course load. When 71% of all classes are taught by non-tenure-track faculty, you don’t think those things are connected?

Erin: And we’re not just “I reckon”-ing over this. I have had students say this to me explicitly.

Chris: And it doesn’t take much searching to find evidence of this from articles in the Chronicle or Inside Higher Ed. So there’s a problem here that compounds itself.

Erin: And that’s how courses are made, boys and girls!


“Why don’t historians teach…?”


Over the weekend, conservative commentator Dinesh D’Souza dipped into a familiar narrative well: “Did you know the Democrats were the party of the Klan???”

When historians said that we knew the complex history of the Klan and the major political parties, he asked why we weren’t teaching this true fact about how the Democrats were the party of the Klan. When we said we were teaching the complex history of the Klan and the major political parties right now, in classrooms across the country, he said clearly we weren’t or everyone would know this true fact about how the Democrats were the party of the Klan.

He basically challenged Princeton historian Kevin Kruse, who studies the history of race and Christianity in the American South, to some kind of historical duel, and then spent the whole weekend railing against leftist, “progressive” historians.


I had a whole post written out about how D’Souza’s “method” of historical interpretation is both invalid and nonsensical (either things change over time or they don’t, and if they don’t, there’s no such thing as history!), but I don’t need to get trolled for all eternity, so instead I’ll make a few points that bear repeating.

Change happens. Context matters. Words mean different things at different times. That’s at the root of historical analysis. D’Souza tries to pretend they don’t, and that’s his whole angle – that historians are playing word games. But simply by virtue of saying one definition of a word from a particular historical moment is the right one, he’s acknowledging the change while trying to deny it any meaning. [Historians do like playing word games in our spare time, though.]

Historians don’t just teach facts, they teach interpretation and analysis of evidence. Some people don’t like that, and call it politics, or intellectual mumbo-jumbo, or political correctness. It’s not. It’s the discipline of history.

When someone says “Historians don’t teach this!” you should consider whether they’re just mad that historians don’t teach the interpretation or narrative they want. When someone says “Historians don’t teach this!” you should also consider whether that person has seen the inside of a history classroom recently.

If all it took to unsettle a popular but incorrect belief about something was a few lectures in high school and college, that would be great. That isn’t the case, though. Do most of us understand probability properly? No, and that’s without it being treated like a political football like the stuff D’Souza’s talking about. Just because someone is talking about the past doesn’t mean they’re doing history. You should consider whether maybe they’re just trying to rile people up to sell a book.

If anyone has any questions if/how/why historians teach specific things, we’re happy to answer them.


Academic Work Isn’t Life


Within academia there is a belief that academics should spend every possible moment working on their research. The academic, this conception holds, loves what she does and every moment spent working isn’t work at all. The tweet below from Igor Aharonovich, an Australian academic, exemplifies this idea.

[Screenshot of Aharonovich’s tweet]

This linking of academic work and individual self-worth, however, promotes unhealthy and destructive attitudes that can lead to mental illness. Aharonovich also endorses an academic exceptionalism that justifies this trade-off as the price for doing what you love—no matter how destructive it may be.

The realities of academics’ lives are more complex than Aharonovich suggests. Many people in academia can’t do their academic work on weekends. They have other responsibilities. Some have to care for children, elderly parents, family members, or partners. Others have to work outside jobs to help pay their bills. Graduate school stipends are pitifully low. Adjuncts and non-tenure-track faculty (who account for over 70% of all academic appointments) lack access to health insurance, retirement plans, and stable employment, necessitating outside income streams. Despite what Aharonovich suggests, plenty of academics cannot afford to devote themselves solely to their research and writing. It’s the height of arrogance to suggest otherwise.

Aharonovich promotes the psychologically damaging idea that academics should measure their self-worth solely through their work. But what happens to individuals when that academic work doesn’t go well? What happens when that article draft comes back from the journal, savaged by two readers in complete agreement? If you’ve wrapped your entire identity up in your work, then those criticisms aren’t just of your work, they’re of you. Your argument lacks evidence. Your thesis is underdeveloped. Why should a reader care about this subject? It means that you’re not interesting enough. You’re not good enough. You start to spiral. What’s wrong with you? Why didn’t you do all those things that seem so clear to the reviewers? Where did you go wrong? Why did your argument fail so badly? Why did you fail so badly? If you’re not a good academic, then what does that say about you as a person?

These unhealthy attitudes towards the self and the relationship between work and identity contribute to the growing mental health crisis in academia. A recent study in Belgium reports that approximately one-third of PhD students are at risk of developing depression or other psychiatric disorders. Another study at UC-Berkeley revealed that between 42% and 48% of STEM graduate students were depressed. Additionally, students had little optimism about their career prospects. Many more students experienced some, but not all, symptoms of depression. These mental health problems have implications beyond just the personal lives of graduate students. Happiness and quality of work go hand-in-hand. As Berkeley researcher Galen Pranger points out, “Thirty years of academic research has gone into showing that happiness, apart from being valuable in its own right, is critical for performance on so many levels.” As tenure lines vanish and academic opportunities dry up, this mental health crisis will only get worse.

Aharonovich portrays academia as a unique workplace that allows you to do what you love. That argument, however, is a damned dirty lie. Plenty of people outside of academia have rewarding careers doing what they love. They have schedule flexibility, intellectual freedom, and better pay. The efforts of people like Jennifer Polk, Maren Wood, and others who specialize in aiding the transition out of academia have shown that academics can find many of the same freedoms off campus. Arguments like Aharonovich’s discourage academics from questioning the structures of academia that have created this culture of unending work. Departments that admit grad students solely to fill the university’s demand for cheap and expendable labor. Administrators who battle the efforts of adjuncts and grad students to organize for better pay and working conditions, while holding out the few remaining tenure lines as incentive. Aharonovich’s arguments reassure academics that the decisions we’ve made in pursuit of higher education were the right ones. That trading away years of our lives for low pay and uncertain (or non-existent) job futures was all worth it. If we believe that we’re doing what we love, then we won’t question our own actions or the structures that benefitted from and encouraged us to make those decisions in the first place.

For any academics who might be reading this, academia isn’t the only thing that defines you. Make friends. Cultivate your hobbies. Get a pet. Take care of yourselves. Netflix and chill. Eat a pint of Ben and Jerry’s. Go for a walk. And most importantly, work doesn’t equal life. You’re worth more than that.

Slack Chat: Las Vegas


Chris: On Sunday night, Stephen Paddock of Mesquite, Nevada opened fire on a crowd of concert-goers in Las Vegas, killing 58 people and wounding hundreds more. It was the deadliest shooting by a lone gunman in modern American history. Erin and David both wrote about dealing with the shooting this week, and we thought we’d continue the discussion in this week’s slack chat.

You made the point that from a historical perspective we need to acknowledge the deeper cultural forces at work.

And I don’t think it helps when we characterize the shooter as a “lone wolf” or describe the incident as “isolated.”

Those terms make this type of violence seem aberrational rather than symptomatic of a broader societal problem.

David: Right, I think it’s important to acknowledge the deeper cultural forces that lead to these shootings. But I also think it’s important to acknowledge the deeper cultural forces that keep us from dealing with them. I think there’s some overlap between the two, but they’re not precisely identical.

But your comment about the “lone-wolf” is important. I wrote about how our individualist ethos keeps us from seeing gun violence as a deeper problem. But that ethos also makes it easier to identify shooters as “lone wolves”.

Chris: I was thinking about this a lot this week. How many of these “lone-wolf” types have Americans encountered in the past twenty years of violence?

David: Really none, right?

I mean, every shooter is getting their ideas from somewhere.

Chris: And they’re generally part of some larger organization or group.

David: But even if they’re not — and I’m thinking especially of school shootings — there’s a culture that has fostered the idea that shooting a bunch of people is something that happens.

Chris: And it’s something that’s treated as just the cost of our constitutional right to gun ownership

To be clear, that is an argument that I oppose vehemently.

David: Yeah, that is a really bizarre argument to me.

Chris: We all agree that there are limits to our freedoms.

“You can’t yell fire in a crowded theater” is a famous example.

David: What gets me is the shocking hypocrisy, or cluelessness, or both. Take Matt Bevin’s (the governor of KY) comment that you can’t “regulate evil”. All through history, Americans have attempted to regulate evil. And, in fact, more often than not it’s been Republicans who have been doing the regulating!

Chris: We attempt to regulate evil all the time.

Murder, theft, rape, etc. are all illegal.

Our desire to regulate those behaviors stems from a belief that some actions are right and others are wrong.

And we as a society need to punish the wrong acts as a way to deter them.

David: Right. And beyond that, Republicans are more likely to moralize right/wrong in terms of good and evil.

Chris: To your point, how often do we hear about abortion being evil?

I also don’t understand the argument that “well, the framers intended to protect gun ownership, so we have to do it too.”

That argument strips away all historical context.

David: I really struggle with this. I try to be charitable to people I disagree with politically, but there does seem to be a shocking, willful obliviousness on the part of gun control opponents.

Chris: Arguments over the nature of government and what the government’s role is didn’t freeze in 1789.

I think it speaks to what you wrote about. If we push past the shocking and, I would argue, ahistorical arguments, what do you have underneath?

David: Not much. And I think on some level, they know that.

Chris: What you have is a culture of violence.

A nation born in an act of violence and perpetuated by acts of violence.

Where do we divide our American history surveys? The Civil War, the biggest act of violence in the nation’s history.

David: And a culture that valorizes guns

I’ve been thinking more and more about that over the last few years. At the risk of sounding like an old crank, I do wonder if we need to start rethinking how gun violence is depicted in popular culture.

Chris: That is a symptom, I think, of this culture of violence.

Have you ever fired a gun?

David: Nope.

Chris: The first time I ever fired a real gun was on the Fourth of July about a month after I graduated from college.

My girlfriend and I had driven her car back to her parents’ house in Louisiana after graduation and her father (my now father-in-law) took us to a gun range on July 4.

He explained that he wanted to expose me (a Massachusetts liberal who only ever shot an air rifle at Boy Scout camp) to some of southern culture. He likes to do things like that.

We shot his black powder pistol (he’s a Lewis and Clark enthusiast) which I kind of enjoyed.

Then he took out a handgun and had me shoot it. I was absolutely terrified.

As I stood at the range, I held the handgun and it struck me: this gun was designed for one purpose and one purpose only. To shoot people.

I shot it once and never again. (I’ve since gone skeet shooting a few times with a shotgun, but only at clay discs.)

But what really struck me was how this was a family event. The gun range was packed with families, little kids, and the like. All just going out to shoot on July 4th.

And for that, I’m glad I went, just to experience this other culture that I had never experienced.

David: That’s always really striking to me (and, admittedly, really outside my own experience). The way gun culture defines family, recreation, etc.

Chris: Yeah, what struck me was just how normal this all was.

Another example. The people we bought our house from kept a 6 foot tall gun safe in their bedroom. They had hunting trophies mounted in one of the kid’s bedrooms and elsewhere.

I guess the hardest thing for me is how we get from supporting recreational hunting to insisting we must protect the right to buy assault weapons.

David: What’s interesting to me is that, at least what I’ve seen anecdotally, recreational hunters are some of the strongest critics of assault weapons.

Chris: Yeah, I mean I think the issue we’re getting at is how we got to a point where, even after a shooting like what just happened in Vegas (or any of the times before), we all know that nothing will change.

While gun ownership is a strong predictor of political affiliation, people of both parties support some level of stricter gun regulation.

David: But here’s something: I’ve been struck by the fact that people seem more determined for change this time. More so than after Sandy Hook, even. Maybe I’m wrong, but I’m getting that sense. And I wonder why that is?

Chris: Perhaps, the cumulative effect?

A more mobilized/organized left?

David: Maybe. I also wonder if Obama not being president has something to do with it?

Chris: Gun manufacturers were in the middle of a “Trump slump”

When Obama was president and when they thought Hillary was going to be elected, the NRA went nuts telling people that the government was going to take away their guns.

I had an unpleasant encounter last September in a Target in Shreveport with a guy who was asking random people if they thought the government was going to take away toy guns next.

Recently, the NRA has upped its rhetoric even further, declaring that the left is out to get them.

God, I wish the Left was as organized and efficient as the NRA portrays it.

David: Yeah. I think the left has been slow to recognize that the NRA isn’t just about guns anymore, but is attacking the whole vision of society advocated on the left

Chris: Yeah, there’s a great piece by Heather Cox Richardson today that highlights how the NRA joined the conservative movement in the 1970s and 1980s and became a key player in the culture wars.

David: Yup. I saw that earlier. It was, in part, what prompted the comment.

Chris: She made another point in that piece that I really liked, which is one we’re all grappling with in the Trump era.

We have no idea where we are in the broader narrative of history, so it’s hard to know when the world has turned upside down.

Her point was, yes, the world has turned upside down thanks to this culture of gun violence.

David: Yeah, I like that point as well. And, as you say, it applies more broadly these days. I really look forward to reading the histories of this era for that reason.

Chris: I think about that a lot in terms of the Russia investigation.

I think in later histories it will dominate the Trump era like Watergate does for Nixon. But living in it, it’s something that’s just brewing under the surface.

David: That seems right. And I think the same point applies: we forget what a slow burn Watergate was for a while.

Chris: It really was. The Saturday Night Massacre occurred like 10 months before Nixon resigned.

I was thinking about this over the past week–I’ve become kind of numb to these types of shootings. Not that I don’t feel sympathy for the victims or anything like that. More that they don’t surprise me anymore.

David: Yeah, though Las Vegas stands out to me as different. In a way that Newtown did as well.

Newtown because most of the victims were young children, Las Vegas because of the scale and elaborate nature of the event.

Chris: I was thinking about how many of these types of violent incidents have marked my formative years. I remember the newspaper front page that featured the Waco compound burning. Oklahoma City, the Olympic Park bombing, Columbine, 9-11, Sandy Hook, Orlando, the Boston Marathon Bombing. Not all of those were shootings, but they nonetheless stood out to me as part of a longer historical trend.

David: What’s striking about the shootings is just how easy it is to pull one off in the US. That’s what makes the US different from Europe.

Chris: That’s a point worth stressing

The US is the only Western developed nation where incidents like this happen regularly.

Occam’s Razor tells us why that is.

Those countries strictly regulate access to guns

David: So what’s your prediction: does anything change because of Las Vegas?

Chris: Unfortunately, I don’t think so. I hope something does change.

You?

I guess I’d also ask what you mean by change.

David: Stricter gun control laws are what I guess I had in mind. But since you quite rightly have pushed me to define my terms, I’d also add a more general shift in the cultural conversation about guns.

And I think I’m with you. There might be some token policy change. But I don’t think we’ll see any meaningful shift, other than to continue to see guns become a more partisan issue (Democrats more ardently pro-gun control, GOP more opposed).

Chris: I’d agree on the token policy change. Hopefully it does push a meaningful cultural conversation, but it’s hard to see how guns are going to rise above all the other issues that are dominating the cultural discourse right now.

Hence my skepticism.

David: Yeah. We do seem to be lurching from one thing to another.

Chris: And we all know why that is. It’s big, orange, and has tiny hands.

After Las Vegas: Avoidance in America


In previous posts, I’ve mentioned a course I taught called Panic in America. When I first devised the class, I imagined it as a lighthearted exploration of the weirder moments in U.S. history. What better way to examine the past than through the wacky conspiracy theories and bizarre ideas that had shaped American culture?

It became quickly apparent, though, that my fun romp of a course was more of a tough slog. It proved impossible to consider any cultural panic without recognizing the deeper social woes that it obscured. Trauma from a gruesome war motivated a witch craze that left more than two dozen people dead. Bizarre theories about Catholic priests and nuns fanned the flames of nativism. Seemingly innocuous paranoia about comic books obscured deep anxieties about rapid cultural shifts and the terrifying reality of the atomic age.

In short, it was impossible to examine any period of U.S. history without realizing that phobias, paranoia, and conspiracy theories served to distract from political and social issues that desperately needed to be addressed.

My course came to mind today as I listened to an hour of AM talk radio (I tell myself I do this to get in touch with “the other America,” but really I think I just enjoy self-punishment). The topic of conversation, unsurprisingly, was the shooting rampage in Las Vegas. More specifically, the discussion was about various conspiracy theories explaining the horrific event.

There was no discussion of guns. The fact that the alleged shooter had enough weapons to outfit a small militia seemed inconsequential compared to theories about the money he wired to the Philippines (perhaps being funneled to ISIS via Abu Sayyaf?) or the fact that he had two rooms in the hotel (clearly evidence of multiple shooters!). And these theories were tame compared to Alex Jones’ assertion of a Muslim-Bolshevik-Democrat conspiracy and Pat Robertson’s proclamation of divine comeuppance against Americans who insufficiently support President Trump.


It’s not a proper conspiracy theory in 2017 if there’s not an Antifa connection.

Why talk about real social problems when you can explain them away with conspiracy theories?

There is, I suspect, some part of this impulse that is pure human nature. It’s easier – and more comforting – to blame random, impossible-to-control causes rather than facing up to the tough task of fixing a deep-seated social woe.

It also seems to me, however, that there are particular aspects of American society that make this problem worse in the U.S. than elsewhere. I don’t claim any originality in my thinking, nor is this a complete list. But three elements seem especially notable:

For-Profit Media: For the vast majority of U.S. history, news organizations have been commercial enterprises. As my foray into talk radio today illustrated, sensational topics sell. So does simplistic storytelling. It is much easier to speculate about conspiratorial forces at work than it is to engage with a culture’s complex relationship with firearms and the seemingly insurmountable political obstacles to enacting gun control.

Hyper-Polarization: Despite efforts by historians like Richard Hofstadter (who, incidentally, famously wrote on the “paranoid style in American politics”) to claim broad consensus in politics for much of U.S. history, the country has, in fact, been sharply divided along partisan lines for the majority of its existence. As we know all too well, polarized partisanship impedes substantive policymaking. And in the absence of the ability to undertake meaningful reform, the temptation arises to look for alternate explanations for social ills.

The Individualist Ethos: Perhaps the most fundamental roadblock is a core aspect of American identity itself: we like to think of ourselves as a nation of rugged, independent, individuals. The suggestion that there might be social problems embedded in U.S. culture is at odds with our values. “Guns don’t kill people,” we are told, “people kill people.” Rather than looking at the all-too-clear pattern of gun-driven carnage and considering that we just might have issues that need addressing politically, we instead rush to examine the individuals involved. Conspiracy theories about individuals — however outlandish they might be — do not challenge our national image. Suggesting that we have somehow failed collectively, however, does force a reckoning with our vision of ourselves.

It is possible to transcend the limitations of our culture and enact meaningful change. But doing so requires that we acknowledge that it’s not just guns that are deeply embedded in American culture. It’s also the expectation that there’s always some explanation, no matter how far-fetched, for why bad things happen, one that spares us from confronting our deep, systemic problems.

Until we do acknowledge those problems to their full extent, a pattern of avoidance three-and-a-half centuries in the making is bound to continue.

Surely we have an obligation to try

Tags

, ,

Right now, it feels like there’s only one thing to put into context, and so I gave it a try. I don’t know that I made much progress.

I made word clouds with all of the post-mass shooting speeches from the Clinton, Bush, and Obama years (lots of uses of the word “community”).
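
For anyone who wants to repeat that exercise, here is a minimal sketch of how such a word cloud might be generated in Python with the open-source wordcloud library, assuming the speech transcripts have already been saved as plain text files (the speeches/ folder and output filename here are hypothetical):

    # Minimal sketch: build a word cloud from a folder of speech transcripts.
    # Assumes the transcripts were saved locally as plain-text files
    # (the speeches/ folder is a hypothetical name).
    from pathlib import Path
    from wordcloud import WordCloud, STOPWORDS

    # Combine every transcript in the folder into one string
    text = " ".join(
        p.read_text(encoding="utf-8") for p in Path("speeches").glob("*.txt")
    )

    # Drop common filler so substantive words like "community" stand out
    stopwords = STOPWORDS | {"applause", "thank", "going"}

    cloud = WordCloud(
        width=800,
        height=400,
        background_color="white",
        stopwords=stopwords,
    ).generate(text)

    cloud.to_file("speech_wordcloud.png")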

I plugged “thoughts and prayers” into Google N-gram (it does show a spike in the last twenty years).

I used Lexis Nexis to see how often “mass shooting” and “quiet town” showed up in the same article (a whole lot, usually linked by surprise that such a terrible thing could happen in a place like this).

I took screenshots of Google searches showing that it’s not just The Onion that has reached the point of simply updating the figures and reposting the same story (it eventually went up to 16 speeches, I believe).

[Image: mass shootings]

I did all this while sitting in the same place I once sat stock still for hours, listening to Connecticut’s NPR reporters piece together the scraps of information coming out of Newtown.

There’s a sense now that nothing ever changes, but looking at some of these speeches, it’s clear that the way we talk about things has changed. Mass shootings, even school shootings, happened well before the Clinton administration, but before we talked about “the worst school shooting since Newtown,” we talked about “since Columbine,” despite a notable school shooting the year before that many people forget until they see the name of the shooter. Even this arbitrary starting point reveals how differently we talked about things twenty years ago.

Clinton’s speeches at Thurston High School in 1998 and at Columbine a year later are remarkably different from more recent speeches, not least because they were directed to students, rather than the community or the nation. Both are full of 1990s fears: violent movies and video games, social isolation, and Marilyn Manson, who, in an absurd twist of fate that could only happen in 2017, was recently injured on stage when two giant prop guns fell on him.

In both speeches, and in his brief remarks to the community in Jonesboro, AR, following the March 1998 shooting at Westside Middle School, Bill Clinton referred to “dark forces” driving young people to commit these terrible crimes. We don’t seem to talk like that anymore, whether because we have different understandings of the dynamics of teenage isolation, mental health, and violent fantasies, or because we’ve given up trying to change people, even young people with plastic minds.

That shift, however, appears in combination with an ever-firmer refusal to do what a nation of laws does to protect its people from danger: pass laws. This has led to truly absurd statements like that of Kentucky Governor Matt Bevin, who yesterday said gun regulation was useless in the face of these tragedies because “You can’t regulate evil.” Given his commitment to using laws to regulate women who seek abortions and interpretive dance majors, it seems clear the issue here isn’t whether regulation works, but what counts as evil enough to be regulated. If we can’t change people, and ideology and political donations mean no one in power will ever change laws, it should be no surprise that the tone of speeches has shifted to consolation, resignation, and ¯\_(ツ)_/¯.

In the late 1990s, Clinton emphasized that we now had to face that “it could happen anywhere.” No president needs to mention that anymore, because it feels like it has happened everywhere, and will happen everywhere. I was in Austria when Columbine happened, a junior in high school on a trip abroad; I watched the news reports with my best friend, depending on our mediocre German to try to understand what was happening. I later spearheaded an effort to plant columbine in our high school’s courtyard in memory of those who had died, because what had happened felt singular in some way. The effort seems ludicrous in retrospect.

In the classroom, one of the hardest things is getting students to realize that the world around them is historically specific, just like the moments in the past that we’re looking at, and didn’t necessarily have to turn out the way it did. Right now, it feels like things couldn’t have ended up differently than this, because our nation just values some rights more than others, and always has. Nothing will ever change.

This is the tricky thing about the past, though. Despite it looking like things could never change, they still changed, in ways that we might like and ways that we might not like. And they don’t change just because, or because of chance, or because of forces totally outside our control; they change because of the choices people make. One goal of terror organizations – be they the Klan, Al Qaeda, or those who bomb Planned Parenthood locations – is to change our behavior, make us too afraid to do things, force us to abdicate control over our fortunes to others, make us give up. I certainly don’t stand in judgment of people who changed their behavior in the face of direct threats from these groups. But we should recognize that groups like this wouldn’t work so hard to get people to give up if human agency and organization weren’t powerful things.

The national debate over gun regulation right now feels so closed that it’s tempting to say it was inevitable, and can never change, at least in the United States. It’s permanently ossified. Reflecting on the last 20 years, even shallowly, reminds us that the debate has changed. It’s changed drastically, in my adult lifetime, and I have to believe it can and will change again.

If we believe historical change has happened (and boy, I hope we do, or a lot of us will be out of our part-time, precarious jobs), we have to believe that it will continue to happen because of the choices that people make. Sure, not everyone has the same range of choices available to them, not everyone can or should have to commit themselves to every cause, and not everyone has the power to make a choice that will change the course of history. But the discussion around, and politics of, gun violence in this country have changed, and will change, whether you’re involved in them or not. Historical change will happen without you, but that doesn’t mean you have to let it happen without you. Surely, as Barack Obama said in the wake of Sandy Hook, we have an obligation to try.