Thursday, October 6, 2011

Methods, Shmethods?

What is it that historians do? My generation keeps being bombarded with post-modern theories; we are constantly warned of our biases and made conscious of our narrative structures. Until this past week, I would have mistakenly made the generalization that historians of my age and position have moved beyond this - that we understand that we are not, in fact, looking for the truth, that history is amalgamous and ambiguous and porous. Of course we have biases, of course we all interpret things differently, and of course this comes out in our writing. The very acts of deciding what to include in our writing (let alone our research), how to structure it, and which words we will use all impact our eventual readers. The professor conducting my methodology seminar this week made the point that the title of that day’s seminar, “Theory and History,” falsely implied the two could be separated. You cannot write history without theory, without employing some methods, whether conscious or simply background to which you pay minimal attention. That point seems obvious once made, but it struck me in the moment. And yet, as discussion progressed, and indeed it was actually a continuation of our introductory “Historians and Historiography” session of last week, it became clear that there are a couple of people, and definitely one very adamant young man, who still subscribe to the traditional, outdated “Historians look for and present the truth” maxim. This not only annoyed me, it surprised me. In a world where we acknowledge the relevance of individual interpretation and experience, how can one possibly still think that there is a definitive right answer to “what happened”? Yes, of course, one cannot claim complete falsehoods. You cannot go around saying that Robespierre died in 1800; it’s simply not true. But there is no single correct interpretation of the French Revolution, of its various phases, of what the Terror was. There are traditional, conventional, normally agreed-upon versions, bien sûr, but we are not beholden to them. If I wanted to, I could play with the dates, claiming that, actually, the French Revolution began in 1787, when active resistance to Louis began at Versailles, rather than the “normal” 1789 argument. I could claim that the French were actually spurred to action by the residents of the Austrian Netherlands (today’s Belgium), who had begun a serious resistance to Joseph II in 1787 when they refused him his taxes. [For the record, these are all simplifications.] How can this colleague truly continue to think that we historians search for a truth and (more scarily) that we will one day find it?
What frustrated me perhaps more was an incredibly condescending discussion of the difference between “popular” and “academic” history. Words like “the public” get thrown out with derision dripping from the speaker’s mouth, a snotty smile crossing their face as they mention historical fiction or the audacity of that television series to bend dates and simplify ideologies. It’s entertainment, you jerks. Cinematography, publisher demands, and pure time and space sometimes dictate changes a “pure history” would prefer not to make. This does not have to make it useless. Entertainment is fun. Willing suspension of disbelief. Stop over-thinking for a minute and enjoy, damnit. A large part of my class jumped down the throat of popular history, drawing a massive gulf between it and “real” or “academic” history. What’s the difference? Academic history tends to be more technical, drier, more boring, more into minutiae, they answer. Why? Dear god, why? I actually brought this up: why do we write “boring” things for academic journals and exciting, more literary stuff for “the public”? Surely it comes down to style, and it would be completely possible to write less dryly for an academic article. No, I was told, that would make my work look less professional. And besides, “the public” does not want to know the origins of a specific Pictish word, they want the romance of William Wallace. Again, why? And who the hell are we to decide? “The public cannot handle the complexities and nuance of real history.” Fuck off. [Clearly, this is a pet peeve of mine.] Museum exhibits tend to be more popular when they confront people with new ideas, new approaches, or contradicting interpretations. [For a powerful example of this, see Eric Foner’s essay on his career as a historian. It’s chapter 1 of his Who Owns History?] People like to be challenged. There is a dignity and a flattery in turning to someone and saying, “What do you think?” This is not to say that professional historians shouldn’t have jobs. Please, I need to be employed someday. But we can certainly engage with “the laymen.” After all, history is everyone’s. Everyone has a history, everyone can engage with the past. Okay, probably not everyone wants to sift through the archival material on the Brabant revolutionaries of 1787-1790. But I bet their story would be interesting to someone. Presented in a relatable, exciting way, any history can be captivating. Lists of names and dates are boring, even to historians! They are not history, though, which brings me back to the original question: what do historians do?
A list of the facts, of the empirical evidence, is nothing. It is a chronicle. It has no inherent value. Not until someone picks it up, reads it, and begins to think does a set of facts become real and important. [Jenkins discusses this a bit in Re-thinking History, for those of you looking for footnotes or references.] The choice of things included and the choice of things excluded are the first elements of importance, of the shaping that a historian undertakes. Then there are the inferences drawn from the list. Will I present it as a list? Will I give explanations of each item? If I do, what kind of language will I use? “Democratic” takes on many meanings. I may write it with one intention and my audience read it with another. Historians interpret. We give voice to what we find in the archives, and that voice is none but our own. We can pretend that we are “objective,” as I’m sure my particular colleague would, but we are not. We never will be. We are human beings, with feelings, thoughts, prejudices, and value systems. I’m currently reading R. R. Palmer’s The Age of the Democratic Revolution (1959) and it is so dated it is laughable. He speaks of parallels between the 18th and 20th centuries and their “revolutions” - the 20th being communism. He makes moral judgments on both, coming down hard on the Western side in opposition to communism. He reminds his readers that just because we do not like revolution now (in the 50s), and the implications of what happened in Russia are mostly negative to his audience, we should not write off the 18th-century revolutions as inherently negative as well. Now, that last point does make sense, but being preached at about the evils of communism is quite annoying when reading about the Atlantic world of the 1790s. And yet his example proves my point (which, for the footnote enthusiasts, is again closely related to some Jenkins): no historian can be taken out of their own context. Palmer’s book will undoubtedly paint the Jacobins and more leftist revolutionaries as extreme, and I will have to account for that as I read, but that does not make him wrong or irrelevant. It makes him dated, certainly, but his is only one available interpretation, and anyone reading it is free to agree or disagree as they see fit. This should be obvious, I lament again. Of course everyone has an opinion. Somehow, though, historians have come to be seen as scientific experts of some kind, giving the world the “right” versions of past events.
When are we going to get to a point where everyone knows to read with a grain of salt, to take people’s backgrounds and ideologies and interests into account? When are we going to move past needing rights and wrongs and into a world of maybes and gray areas? When are we going to enjoy the debate for its own sake and revel in the availability of different points of view? When, when, when?! 
When I get my PhD and get out there into the real world, that’s when. [She writes arrogantly.]
NB: The above is all my own interpretation, shaped by my own interests, history, current events, and world view. You are free to disagree.

3 comments:

  1. I'd like to add an "and when"...and when are we going to stop self-flagellating with the subjective cat o' nine tails? When does insisting that humanity is categorically and obviously subjective (and therefore incapable of perfection) stop being revelatory and empowering and become incapacitating? You know I'm with you on the arrogance of academic history's pseudo-scientific method, and the delusions of historian truth-seekers, but can we also acknowledge that constantly recognizing our own inescapable subjective stance through conscientious theorizing is paralyzing?

    I still agree with Frank...you know how I write history? I do the research and write about what I find, as fairly and accurately as I can. BUT WHY THE HELL IS THAT NOT GOOD ENOUGH?

  2. Wow! A great blog. I agree with you wholeheartedly, as one of the "amateur, general-public" historians. And for the record, I find knowing the origins of a specific Pictish word not only interesting but fascinating and elucidating to the context in which I may find it!
    Mom

  3. I do concur. Completely. In fact, I think we're saying similar things differently (surprise, surprise). My comment about bias and being human beings was meant to be a positive - it's inescapable but it's also a good thing. We would be soulless automatons without our "biases," which really should just no longer be the word we use. I prefer perspective. Much less loaded, eh?

    P.S. I heartily wish you were in the class to commiserate with me.
