“Would a complete chronicle of everything that ever happened eliminate the need to write history?” — St Andrews final exam question in mediaeval history, 1981

“To give an accurate and exhaustive account of that period would need a far less brilliant pen than mine” — Max Beerbohm

* * * * *

About a year ago, I took a short creative non-fiction course on the topic of writing historical narratives for a general audience. The instructor, Dr. Richard Mackie, emailed the class the above quotes, to stimulate thoughtful reflection about the nature of history and historical writing. (The first quote was actually a question that Dr. Mackie himself encountered as a History student at St. Andrews in the 80s.) These quotes have come to mind lately as I’ve been ruminating about the implications of doing history in a digital age.

The era of the Internet has, I think, made the idea of a “complete chronicle” of our current times more conceivable than ever before. The Web has certainly made it possible for virtually anyone, irrespective of gender, class, ethnicity, etc., to share their thoughts, ideas, photos, videos, even “statuses” (i.e. what one is doing at a precise moment in time) continuously. Provided that all of this electronic data is adequately preserved, a vast store of information will be available to anyone a generation or two (or more) down the road who is curious about the interests, opinions, tastes, preoccupations, etc. of ordinary people in our time.

Yet, such information, no matter how detailed, is not the same as history. The chronicling of people’s lives, even on as minute a level as that expressed in an article about “lifelogging” by New Yorker writer, Alec Wilkinson, [1] results only in the production of information. It is the interpretation of that information, the piecing together of disparate parts into a coherent and (hopefully) elegant narrative which pulls out (or, more accurately, constructs) themes and patterns, that transforms it into history, into a meaningful story about the past.

What’s interesting, of course, is that although no historian (I think) would ever claim to write the history of any subject, discussions about the potential of history in the digital age have sometimes suggested that history can be more complete than ever before. The idea of hypertextual history, for instance, where readers of a historical account can click on links leading them to pertinent primary source documents on the topic, say, or to other similar or divergent viewpoints about the particular subject they’re examining, has almost a decentring impact at the same time that it provides more information. It can be easy for readers, I think, to be overwhelmed by the profusion of hyperlinks within a text, and perhaps to never finish reading the actual article to learn the historian’s particular approach to the past.

The beauty of history, I think, is not that it claims to be a complete, exhaustive chronicle that leaves no stone unturned in its examination, but that it presents one angle on the past, a new way of understanding something that is extraordinarily complex and, for that reason, is open to — and I’d even say requires — multiple interpretations. History is, after all, a story as opposed to a record book, a narrative as opposed to mere facts.
______________________________

[1] Wilkinson’s interesting article recounts how computer guru Gordon Bell has been involved in a “lifelogging” experiment, in which he wears a Microsoft-developed device called a SenseCam around his neck that takes continual pictures of his day-to-day experience and allows him to record his thoughts at any given point in time if he so wishes. According to Wilkinson, Bell “collects the daily minutiae of his life so emphatically that he owns the most extensive and unwieldy personal archive of its kind in the world.” Alec Wilkinson, “Remember This? A Project to Record Everything We Do in Life,” NewYorker.com, May 28, 2007, http://www.newyorker.com/reporting/2007/05/28/070528fa_fact_wilkinson.

I like words. I like their abundance. Their variety. The different nuances that they contain. I like how a word like “melancholy,” for example, has a slightly different flavour than “forlorn,” or how the word “myth” conveys something deeper – more emotional, more enduring, more fluid – than its neutral variant, “story.”

Now, most of my Public History peers would probably say that I don’t just like words – I like polysyllabic words. I’ll be the first to admit that long-ish and rather atypical words have a tendency to come to me, unbidden, and that I have an equal tendency to utter them aloud, without thinking. This penchant for the polysyllabic has sometimes even gotten me into trouble, won me unintended notoriety among my peers. 😉

As someone studying in the field of Public History, I realize that I have to think especially carefully about the words that I use, not only because using the “wrong” word can alienate a general audience but also because choosing the “right” word involves weighing various needs, such as those of the client and of the audience, as well as my own for precise language and “good” history. So, while I might immediately prefer the word “auspicious” over “lucky” or feel that “appropriated” explains a situation more clearly than “took,” I’m learning to pause and reconsider the effects of my instinctive word choices, and I’m learning to negotiate the sometimes conflicting needs and desires that exist at the micro-level of diction.

What I didn’t expect was to have to consider the machine as an audience as well. And yet that is what Dan Cohen’s blog post has drawn to my attention. In “When Machines are the Audience,” Cohen suggests that the age of machine-readable text means that we’ll need to write a little differently – or at least with the awareness that what we write can be more, or less, visible in a digital environment depending on the words that we use. The implication is that because text can be read by machines and mined using keyword searches, it is better to use precise and even unique terms or tags in related documents, so that the writing can be more easily searched, grouped, or retrieved. Cohen mentions, for example, how coming up with a unique string of words to identify related history websites can facilitate the process of narrowing searches to these sites only, so that relevant research information can be found. He also cites the example of a legitimate, history-related email being marked as spam, because of its unfortunate use of certain words that are high on the list of favorites for spammers. [1]
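Cohen’s point about unique strings of words can be illustrated with a toy sketch. Everything here is invented for demonstration: the document texts and the made-up tag “histcanwest2007” are assumptions, not anything Cohen proposes verbatim; the idea is simply that a distinctive term narrows a keyword search in a way common words cannot.

```python
# Toy illustration: a distinctive tag string narrows a keyword search
# far more effectively than a common word does. All documents and the
# tag "histcanwest2007" are invented for this example.

documents = {
    "doc1": "A history of the fur trade in western Canada histcanwest2007",
    "doc2": "A history of western philosophy",
    "doc3": "Railways and settlement in western Canada histcanwest2007",
    "doc4": "Recipe blog: a history of the western omelette",
}

def search(term, docs):
    """Return the ids of documents whose text contains the term."""
    term = term.lower()
    return sorted(d for d, text in docs.items() if term in text.lower())

# A common word matches nearly everything...
print(search("western", documents))
# ...while the unique tag retrieves only the related history sites.
print(search("histcanwest2007", documents))
```

Searching on “western” returns all four documents, while the invented tag returns only the two sites that deliberately embedded it, which is the narrowing effect Cohen describes.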

Looking over the words I used in a recent email to a friend, I’ll confess that terms ranging from “usurp” and “misnomer” to “vignette” and “quadratic” (yes, as in the quadratic equation) made their way in. (You are probably feeling some pity for my friend right now.) However, I’m consoled, and slightly amused, by the fact that what stands out as atypical word usage is precisely what spam filters ignore. At least, in this area, my penchant for the polysyllabic – for what’s seen as the atypical – has some redemptive purpose. 🙂
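The spam-filter aside can also be sketched in miniature. This is a deliberately naive word-weight scorer, not any real filter’s algorithm; the weights and example phrases are invented. It shows only the general principle: words the filter has learned to associate with spam add to a message’s score, while vocabulary it has never flagged contributes nothing.

```python
# Toy spam scorer: sum per-word "spamminess" weights. The weights
# below are invented stand-ins for what a real filter would learn
# from training mail. Words absent from the table contribute 0.0,
# which is why unusual vocabulary tends to sail past the filter.

spam_weights = {"free": 2.0, "winner": 3.0, "credit": 1.5, "offer": 1.0}

def spam_score(text):
    """Score a message by summing the weights of its known spammy words."""
    return sum(spam_weights.get(word, 0.0) for word in text.lower().split())

print(spam_score("free credit offer winner"))           # spammy vocabulary
print(spam_score("a vignette about quadratic misnomers"))  # atypical vocabulary
```

A real Bayesian filter is subtler (it weighs evidence for and against, and treats unknown words as neutral rather than harmless), but the contrast holds: the polysyllabic and atypical rarely trips the alarm.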
________________________________

[1] Daniel J. Cohen, “When Machines are the Audience,” Dan Cohen’s Digital Humanities Blog, http://www.dancohen.org/blog/posts/when_machines_are_the_audience.

It’s funny how things work out sometimes…

I was drawn to the Public History program at Western in part because I thought that a program requiring students to actively reflect on the process of becoming practitioners of history was supremely interesting and right up my alley. I like to think about process, and I identify with what E.M. Forster once wrote: “How can I know what I think till I see what I say?” Writing is for me (as for others, I imagine) a process of not only refining my thoughts on a particular topic but also discovering them. (As an undergraduate, I waited for theses to emerge in my papers — sometimes, I confess, near the end of the drafting process — with as much apprehension as a 10-year-old staring at a Magic 8-Ball after she’s asked some all-important life question…like: Is Tyler really the one I’m going to marry?)

…and yet, here I am, “slogging” (both in the literal sense and as a spin-off to Andrew’s apt amalgamation) my way through my first introductory post, after having spent a ridiculous amount of time thinking up a two-and-a-half word blog title.

Perhaps it’s because I’m a little distracted by all the thoughts and opinions about academic blogging that I’ve read this past week. The good, the bad, and the in-between. I wonder who (besides my Public History colleagues) will actually read this blog and for whom exactly I’m writing. I wonder if I’ll have anything original to add in an already proliferating blogosphere. (I appreciated Matthew’s comment about the burden that originality poses.) And I wonder if my posts will deteriorate by mid-November into navel-gazing comments about breakfast (hopefully not, as I tend to skip it).

Mostly, I wonder how I will navigate the rather alien terrain of the digital world, from blogging to web-building, when, as an appreciator of old things, I’ve mostly not kept up with the new. (I just sent my first ever text message last week; the iPhone, frankly, scares me.) Nevertheless, I hope somehow to be able to speak what Manan Ahmed calls “future-ese,” to be able to learn (some of) the language of the programmer over the course of this year so that I can begin to “re-imagine,” as Ahmed has exhorted, the old in new ways. I’m excited, if duly daunted, by the prospects.

Now where’s a Magic 8-Ball when one needs it?