“I hope somehow to be able to speak what Manan Ahmed calls “future-ese,” to be able to learn (some of) the language of the programmer over the course of this year so that I can begin to “re-imagine,” as Ahmed has exhorted, the old in new ways. I’m excited, if duly daunted, by the prospects.” ~ Quoted from my first blog post, 10 September 2008.

* * * * *
If I ever met Manan Ahmed, whose Polyglot Manifestos I and II were two of the very first assigned readings for our Digital History class, I would let him know that, like any effective manifesto, his inspired me to take a certain course of action this year – to sign up for the role of Programmer for the digital exhibit that the class would be preparing on the work of Dr. William Harvey.

Incidentally, if I ever did meet Manan Ahmed, I would also casually let him know that I hold him entirely responsible for the sleepless nights I had, agonizing over the code for the program I was attempting to write for an interactive exhibit on Harvey.

I might add here that I knew as much about programming as I did about APIs and mashups prior to this year, which is to say, nada.

(Accusatory) jesting aside, I’ve been reflecting on what it has been like to learn programming from scratch over the course of this school year. I was inspired, as mentioned, by Ahmed’s call for the historian to be more than simply a scholar submerged in past-ese without regard for how their studies might be made relevant to a modern audience (i.e. in present-ese) or how it might be re-imagined in the age of mass-digitization (i.e. in future-ese). How compelling was his call for historians to be “socially-engaged scholar[s],” how apt his challenge for us to become polyglots – master “togglers,” if you will, between past-ese, present-ese, and future-ese – apt especially to those of us with public history ambitions, who had entered the program interested in communicating the past to a general audience in new (i.e. digital) ways. [1]

“All that is required,” Ahmed wrote simply (alas, too simply), as a directive for historians willing to venture into the programmer’s world, “is to expand our reading a bit.” [2]

After my eight-month foray into programming, the words “all” and “a bit” in Ahmed’s above statement strike me as just a tad understated. I agree that reading was certainly a major part of my process of learning how to program this year: I pored over, highlighted, marked up, and even wrote conversational notes to the authors of my text (such as the occasional “not clear!”). But I think Ahmed might have also mentioned that not only reading but practicing, experimenting, fumbling, failing, and, yes, even agonizing, are all part of the process of learning how to speak some of the programmer’s language.

Like immersion into any new language, programming has its own set of daunting rules to absorb; break any one of them and you won’t be understood – at all. The program simply won’t run. (I don’t know how many error messages I gnashed my teeth at.) As well, like any language, there is always more than one way to say the same thing – and some of them are more “logical,” “eloquent,” or just plain clearer than others; concision and verbosity, I’ve learned, apply as much in the programmer’s world as they do in the writer’s. (I’ve also observed that my tendency to be wordy carries over into the world of code. In fact, I was delighted to learn about the concept of iteration, where lines of repetitive code could be magically – well, okay, mathematically – reduced to a few simple lines, using a variable and a loop. If only paring down written text were so easy!)
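(For anyone curious, here is a minimal sketch of the sort of reduction I mean. It is written in Python purely for illustration – not necessarily the language I used for the exhibit – and the artifact labels are made up.)

```python
# The wordy way: one line of code per item, repeated by hand.
print("Artifact 1")
print("Artifact 2")
print("Artifact 3")

# The iterative way: a variable and a loop say the same thing in fewer lines.
for number in range(1, 4):
    print(f"Artifact {number}")
```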

Needless to say, I found the immersion into the programmer’s language very challenging. It was challenging (and, I will admit, even harrowing at times) because not only was I trying to accumulate basic knowledge of the new language, I was also brainstorming ideas for an interactive exhibit on Harvey at the same time. In some ways, it felt like trying to compose a Shakespearean sonnet in Chinese with the vocabulary of a second grader – which is pretty much the extent of my vocabulary in Chinese. All I could envision was something rudimentary at best.

It was challenging to design an exhibit as I was learning the new language simply because I did not know if the ideas that I or others had were actually possible, or, more precisely, would be actually possible for me to learn how to do within the time limit. (I also discovered a humorous difference between the kinds of ideas thrown out by those in Programming versus those in non-Programming roles; the “anything is possible” optimism that technology seems to inspire was not so readily exhibited by those of us who had confronted, and would still have to confront, the befuddling intricacies of code.)

Despite all the challenges, uncertainties, and yes, even secret fears that the particular interactive exhibit I was working on might not come to fruition, things worked out. We hosted our Digital Exhibit on Harvey in early April; all programs functioned; no computers crashed (thank goodness). Looking back to September and my reasons for deciding to learn how to program, I think I am glad, after all, that Ahmed had made it sound so simple. With just a bit of reading, he had written coaxingly, the socially-conscious scholar will be well on his or her way to programming, to filling that gap between the public and the past, and between computer scientists and the future of history. If he had spelled out all the emotions one was apt to go through when learning how to program, I’d probably not have taken it on and thus have missed out on learning to speak a new language, on learning to speak in code.

____________________________

[1] Manan Ahmed, “The Polyglot Manifesto I,” Chapati Mystery, http://www.chapatimystery.com/archives/univercity/the_polyglot_manifesto_i.html.

[2] Manan Ahmed, “The Polyglot Manifesto II,” Chapati Mystery, http://www.chapatimystery.com/archives/univercity/the_polyglot_manifesto_ii.html.


Sometimes, I wonder what historians of the future are going to be writing about when they examine the early twenty-first century. No doubt, the term “digital revolution” is going to creep into more than one future monograph about our present-day times. Cultural historians (if cultural history is still in vogue) might also, I think, take some delight in tracing the ways in which Google has entered into modern consciousness. Perhaps they’ll trace the moment when Google ceased to be only a proper noun, when the phrase “Let’s google it!” first appeared, and then flourished, in popular discourse. Or maybe they’ll explore the ways in which Google has become a part of popular culture and everyday life, to the point of inspiring satirical responses expressed in, you guessed it, digital ways.

Here are some anecdotes to help that future cultural historian.

* * * * *

A while ago, a friend told me an amusing story about how the father of one of her friends was confused about the nature of the Internet. He had never used it before (yes, there are still such folks), and he didn’t quite know what it was all about. So, one day, he asked his son to explain, framing his question according to the only term he was familiar with – or had heard often enough: “Is Google,” he asked innocently, “the Internet?” The son choked back a gasp of unholy laughter and proceeded to explain the phenomenon of the Internet to his father. However, if he had simplified his response, if he had said that Google was, in a way, the Internet, he might not have been all that wrong.

* * * * *

During Christmas dinner with my family this past winter, Google (of all topics) entered into our conversation. I don’t remember how exactly. All I recall is that my mom, who (yes, it’s true) had never heard of Google before, perked up when she heard the term at the dinner table, probably because of its odd sound. “Google?” she said, brows furrowed, “what is Google?” To that, my dad, without missing a beat, responded (in Chinese) that Google “is the big brother of the Internet.” Now, “big brother” (or “dai lo”) in Cantonese, when used in a figurative sense, simply means someone who is to be respected, some important or dominant figure or force. But I couldn’t help laughing at the Orwellian overtones my father’s comment had unwittingly taken on. He had meant big brother; I, of course, had heard Big Brother, Chinese-style.

* * * * *

Back in September, Dr. Don Spanner, my archival sciences professor, showed the class a video clip called Epic 2015. Its opening lines were captivatingly ambiguous: “It is the best of times,” said the solemn narrator, “It is the worst of times.” We were entranced by the video’s fictitious yet somewhat chilling projection of the world in 2015, which involved no less than the merging of two powerful companies (Google and Amazon) into Googlezon, an entity whose power to make and disseminate information had diminished even the might of the New York Times. At the end of the clip, Don joked that the first time he watched it, he just wanted to sit in a corner and stare at paper for a long, long time. We all laughed – and, perhaps, shivered inside a bit too.

Subsequently, I mentioned the clip to a friend, remarking on how interesting it was to see just how big Google had become, as evidenced by the fact that it was inspiring responses like Epic 2015, with its subtle questioning of the Google empire and its cultural hegemony. My friend in turn enlightened me about other, similar responses. He asked if I had ever heard of “The Googling.” I hadn’t. So he emailed me links to several clips on YouTube, which explore Google’s services (such as its mapping tools) in a new – and, of course, hilariously sinister – way. To view them…simply google “The Googling.” 🙂 (There are five parts.)

* * * * *

To the cultural historian of the future:

It was true. Google was (is?) ubiquitous, to the point that it entered into dinner table conversations and was mistaken (or correctly identified?) for the Internet. Even to the point of inspiring satirical YouTube clips and prophetic visions of a Google-ized world. That is, of course, when you know something is big – when it becomes the subject of cultural humour and unease, negotiated and even resisted in satirical ways.

So, we embraced Google even while scrutinizing it at arm’s length. We questioned Google even while googling. It’s what we did in the early twenty-first century.

“Predicting the Next 5000 Days of the Web”

Recently, a friend of mine sent me a link to a video on TED.com. In this video (which, if you have 20 minutes to spare, is highly worth checking out), the presenter, Kevin Kelly, traces the Web’s development over the last 5000 days of its existence. Calling it humanity’s most reliable machine, Kelly compares the Web, in its size and complexity, to the human brain – with the notable difference being, of course, that the former, and not the latter, is doubling in power every two years. As he sees it, this machine is going to “exceed humanity in processing power” by 2040.

Not only does Kelly look back on the last 5000 days, he also projects forward, and considers what the next 5000 days will bring in the Web’s evolution. What he envisions is that the Web will become a single global construct – “the One” he calls it, for lack of a better term – and that all devices – cell phones, iPods, computers, etc. – will look into the One, will be portals into this single Machine.

The Web, Kelly states (pretty calmly, I might add), will own everything; no bits will reside outside of it. Everything will be connected to it because everything will have some aspect of the digital built into it that will allow it to be “machine-readable.” “Every item and artifact,” he envisions, “will have embedded in it some little sliver of webness and connection.” So a pair of shoes, for instance, might be thought of as “a chip with heels” and a car as “a chip with wheels.” No longer a virtual environment of linked pages, the Web, in Kelly’s conception, will become a space in which actual items, physical things, can be linked to and will find their representation on the Web. He calls this entity the Internet of Things.

We’re not, of course, quite at that stage yet, but we are, according to Kelly, entering the era of linked data, where not only web pages are being linked, but specific information, ideas, words, nouns even, are being connected. One example is the social networking site, which allows a person to construct an elaborate social network online. From Kelly’s perspective, all this data – the social connections and relationships that we each have – should not have to be re-entered from one site to the next; you should only have to convey it once. The Web, he says, should know you and who all your friends are – “that’s what you want,” he states (again, quite matter-of-factly) – and that is where he sees things moving: the Web should know and remember all this data about each of us, at a personal level.
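(To make the idea of linked data a little more concrete, here is a toy sketch in plain Python – the names and relationships are entirely invented – of how connections might be stated once, as simple subject–predicate–object triples, and then queried by any site or service instead of being re-entered on each one.)

```python
# Hypothetical data: each fact is stated exactly once, as a triple.
triples = [
    ("alice", "knows", "bob"),
    ("alice", "knows", "carol"),
    ("bob", "lives_in", "London"),
]

def friends_of(person, data):
    """Return everyone the given person 'knows', wherever that fact was stated."""
    return [obj for subj, pred, obj in data if subj == person and pred == "knows"]

print(friends_of("alice", triples))  # ['bob', 'carol']
```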

In this new world of shared data and linked things, where (as I have pondered in a previous post) the line between the virtual and the physical is no longer an identifiable line, Kelly sees one important implication: co-dependency.

The Web, he says, will be ubiquitous, and, in fact, for him, “the closer it is, the better.” Of course, in knowing us personally, in anticipating our needs, the Web exacts a price: “Total personalization in this new world,” Kelly concedes, “will require total transparency.” But this is not a hefty price for Kelly, who, it seems, would prefer to ask Google to tell him vital information (like his phone number for instance) rather than try to remember such things himself.

This sheer dependency on the Web is not a frightening prospect for Kelly; he compares it to our dependency on the alphabet, something we cannot imagine ourselves without. In the same way, he argues, we won’t be able to imagine ourselves without this Machine, a machine that is at once going to be smarter than any one of us and yet (somehow) a reflection of all of us.

Kelly ends his talk with a “to do” list, which, for its tone alone, needs to be repeated:

There is only One machine.
The Web is its OS.
All screens look into the One.
No bits will live outside the web.
To share is to gain.
Let the One read it.
The One is us.

I will confess that when I finished watching the video, I shuddered a little. I found Kelly’s predictions to be fascinating but also pretty unsettling. (And I’ll admit: it made me think of The Matrix and 1984 at various points.) My reaction, I suppose, stems from a discomfort with anything that smacks of…centralization, of one big global entity, so the concept of One Machine that owns everything and connects everyone, One Machine which is both us, and yet a bigger and smarter and better us, is simply disconcerting.

I can admit my own biases. As a History student, I’ve examined enough regimes over the years to be wary of certain rhetoric and have noticed how, at times, things framed in terms of unity and community and sharing (in this case, of data, networks, knowledge, etc.) can devolve into something that becomes a cultural hegemony of sorts, or worse, a system of pervasive surveillance. (Kelly did, after all, mention shoes as “a chip with heels” and cars as a “chip with wheels,” developments which can certainly aid individuals in staying connected to one another, for instance, but can also aid the State in staying connected to individuals.)

The embedding of the digital into everything in the material world, including people, so that it can be read by one machine, is an unsettling notion. I guess I don’t really want everything personalized, don’t want a single, networked, global machine that reads me and knows everything and everyone connected to me. It may be convenient – and it seems that convenience and speed are the two guiding principles behind technological developments – but I’d rather not be so transparent in cyberspace, so utterly “Machine-readable.”

Putting aside my own personal reaction to Kelly’s predictions, what are the implications of his vision of the Web’s evolution for the discipline of history? If realized, the new world of linked things will certainly demand, as some digital historians have already noted, a re-thinking of how history is researched and presented. The possibilities for new connections to be made – not only between data and ideas, but between actual things, artifacts – are, I’m sure, going to change and open up the ways in which history is understood, communicated, and taught, especially by and to the public.

I can imagine already, down the road, that someone interested, say, in the history and development of the camera might be able to pull up, in a split-second search, every single camera made in the 20th century that is owned and catalogued by all the museums that have taken the trouble to embed digital information in these objects, which makes them linkable, “machine-friendly.” Artifacts made in the 21st century that already contain that “sliver of webness and connection” will be even easier to search for and pull up (for the 22nd-century historian, let’s say), existing already, as it were, as digital-physical hybrids. The ease with which actual things can be connected and thus compared, regardless of their geographical location, is going to make for some interesting comparative histories.
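(Here is an imagined sketch, in plain Python with invented data, of what such a split-second, cross-museum search might boil down to once artifacts carry machine-readable metadata.)

```python
# Hypothetical catalogue entries, as linked museum data might expose them.
catalogue = [
    {"object": "Kodak Brownie", "type": "camera", "year": 1901, "museum": "Museum A"},
    {"object": "Leica I", "type": "camera", "year": 1925, "museum": "Museum B"},
    {"object": "Astrolabe", "type": "instrument", "year": 1590, "museum": "Museum A"},
]

# Every 20th-century camera, regardless of which museum holds it.
cameras = [item for item in catalogue
           if item["type"] == "camera" and 1900 <= item["year"] < 2000]

for item in cameras:
    print(item["object"], "-", item["museum"])
```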

So, the possibility for new engagements with history is there (as it ever is, I think, with the emergence of new technologies). I only wonder how one is going to possibly keep up with all the changes, with the Web and its continual evolution.