Personalized Utopia or Orwellian Dystopia?

“Search engines will become more pervasive than they already are, but paradoxically less visible, if you allow them to personalise increasingly your own search experience.” [1]

“If you’re really concerned about technology, however, remember that it has the most potential to be dangerous when you stop seeing it.” [2]

* * * * *

In my last Digital History class, I (very casually) threw out the term “Orwellian dystopia” into the pool of ideas and concepts for potential discussion. I threw it out a bit in jest (because I have a declared weakness for polysyllabic words), but mostly in earnest (because, as indicated in my last post, I have had cause to think about Orwell and 1984 lately). The term isn’t mine of course, but comes out of one of the readings for the week: Phil Bradley’s “Search Engines: Where We Were, Are Now, and Will Ever Be.”

As its title clearly suggests, Bradley’s article traces the evolution of search engines, from their rather crude beginnings, when web design wasn’t yet a consideration, to their present-day, sophisticated forms, which promise to make our searches more personally relevant than ever before. Ruminating on the potential for search engines to get to know us individually – to the point of recommending events that we (as in you specifically, or me specifically) might wish to attend when visiting a new city, or telling us whether the supermarket down the road has the same item you or I want, only cheaper – Bradley makes the point about the pervasiveness and increasing invisibility of search engines that forms the first of the two opening quotes above. He then wonders whether the ways in which users are sitting back, letting the alerting services of search engines bring custom-made information to them – “since the engines can monitor what you do and where you go” – will lead to an “Orwellian dystopia” of sorts. Bradley’s advice for avoiding such a dystopia? “Users will need to consider very carefully,” he writes, “exactly to what extent they let search engines into their lives.”

Bradley’s point about the expansive, yet increasingly invisible, nature of search engines fits nicely with some of the ideas articulated in a blog post by my Digital History prof, Dr. William Turkel (or Bill, as he would wish us to call him). In this post, entitled “Luddism is a Luxury You Can’t Afford” (which, I might add, graciously avoids lambasting latter-day Luddites and seeks instead to understand them), Bill considers what exactly it is that neo-Luddites are objecting to when they consider technology. Drawing on Heidegger’s distinction between ready-to-hand and present-at-hand objects, Bill points out that it is the second group that poses problems for those uncomfortable with technology. This is simply because these objects are always visible and often intrusive – “something you have to deal with.”

Meanwhile, ready-to-hand things are invisible, unnoticed, and therefore accepted as a natural – rather than a technological – part of existence (the coffee cup, electricity, the iPod even). However, “these invisible and pervasive technologies,” Bill notes, in the same vein as Bradley, “are exactly the ones that humanists should be thinking about…because they have the deepest implications for who and what we are.” The post ends with the words I quoted above, about invisibility being the most potentially “dangerous” aspect of technology.

* * * * *

I have been ruminating lately on the idea of transparency on the Web, and it seems to me that there is a strange sort of tension that exists.

On the one hand, as Kevin Kelly’s talk has shown (discussed in the previous post), users are required to be transparent in cyberspace, if they wish to have any sort of personalization. The Web can’t make recommendations if it doesn’t know you, if your searching patterns, socializing patterns, buying patterns, browsing patterns are not visible.

On the other hand, the Web’s very collection of this information, and the ways in which it presents you with a set of personalized results, are becoming less visible, as Bradley has argued. One might actually put this another way, and say that the Web, search engines included, is becoming more transparent, not in making itself visible, but in making itself invisible. It is becoming so transparent that, like spotlessly clear glass, users cannot see that there is something mediating between them and the information world out there, and so they might be tempted to conclude that the search results they’ve gotten are natural and obvious results showing the most naturally important and self-evident links.

In our last Digital History class, it was Rob, I believe, who brought up the issue of Google search results and the problem that they can be very limiting – without the user realizing that they are limiting. Since, as Dan Cohen has said, Google is the first resource that most students go to for research (at least, the graduate students he polled do), [3] the results it presents may very well define how a subject is understood. The danger, then, is that users won’t be critical of their search results, and of how those results may be tailored or skewed based on numerous factors, because they don’t even realize that such mediation is taking place. Thus, invisibility, as Bill has noted, is a definite problem. And, I have to wonder as well, in terms of research, if personalization is too. Will one’s understanding of World War One, say, or of John A. Macdonald, or Joseph Stalin, or Mary Wollstonecraft be influenced by one’s buying patterns? Music interests? Socializing habits? If it’s to be accepted that students start their research using Google, and they are signed into Google’s services when conducting such research, what implications does this have for their understanding of history? For the writing of history?

So to try to tie together the strands of these last two posts: it seems that transparency on the Web, in the search engines that exist, is rooted in their invisibility – such engines are a window to the information world that’s easily mistaken for an unmediated glimpse of the real world itself – while transparency of users means their utter visibility and machine-readability. I agree with Bradley and Bill that we shouldn’t take invisible technologies for granted – they need to be explored, critiqued, and discussed; made visible, in other words – so that users can decide for themselves how far they want to go in the age of personalization.

* * * * *

P.S.

I feel compelled to add that to approach these issues from a questioning stance is not to reject the benefits of search engines, or of personalization, or of the Web, or of technology at all. (I, for one, recently ordered a book from Amazon based on its recommendation – or, more precisely, because its very first recommendation was a book that a friend had already suggested I read; that Amazon was in line with someone who knows me well amazed me enough to purchase the book!) The issue is never simply, I think, technology in and of itself. The crucial issue at hand is the uses of technology: how new capabilities, especially in the digital age, are going to be employed, and what their development and effects might mean (and have already meant) for individuals and groups in society.

To conclude, I think Marshall McLuhan’s classic metaphor about the dual nature of technology – how it both “extends” and “amputates” – is still relevant and instructive today. It seems that in most discussions about technological innovation, we nearly always hear about “extensions” (Kelly did the same thing; though interestingly, he went so far as to reverse McLuhan’s idea, calling humans the extension of the machine) but we rarely hear about “amputations.” Perhaps a balanced approach – one that keeps visible both the advantages and disadvantages of new technologies, that considers their impact in a broad sense, neither blindly fearing them because they are new nor unreservedly embracing them for the same reason – is the way to ensure that we remain critical in the digital age.

_______________________________

[1] Phil Bradley, “Search Engines: Where We Were, Are Now, and Will Ever Be,” Ariadne Magazine 47 (2006), http://www.ariadne.ac.uk/issue47/search-engines/.

[2] William J. Turkel, “Luddism is a Luxury You Can’t Afford,” Digital History Hacks, http://digitalhistoryhacks.blogspot.com/2007/04/luddism-is-luxury-you-cant-afford.html.

[3] Daniel J. Cohen, “The Single Box Humanities Search,” Dan Cohen’s Digital Humanities Blog, http://www.dancohen.org/2006/04/17/the-single-box-humanities-search/.


I’ve come to Western prepared to cast off my Luddite tendencies and to embrace technology wholeheartedly (or, I might add, as wholeheartedly as can be possible for one who has always been rather critical of science & technology, and their impact on human lives).

Last week, in our Public History meeting, I had cause to re-think the matter. I had cause to re-think matter itself, to consider the physical versus the virtual.

As a class, we visited the UWO Medical Artifact Collection housed in the basement of the Health Sciences Addition. There our professor, Dr. Michelle Hamilton, gave us a tour of the collection and even bade us don white gloves so that we could examine certain objects up close. This was going to be interesting. My experience with “handling” artifacts has mostly consisted of manipulating images of 3-D objects. Being able to zoom in and out of an image and to rotate it a full 360 degrees, so as to view the object from a multitude of angles, has made me think that perhaps the virtual is sufficient. It’s fast, painless, interactive, accessible, and safe – the artifact’s continued preservation is not at risk. What more could one ask for?

Well, as it turns out, I was reminded about the weight of matter.

Picking up one of various blades belonging to a late 19th century amputation kit, I had a visceral reaction. The blade was labelled “Amputation Saw” and was likely used during the American Civil War. I ran my finger along its grooved ebony handle; examined the engraving that read “A.L. Hernstein, New York”; noticed how the wide blade caught the light and gleamed; stared at its sharp, serrated edge and the brown specks of dried blood against the cold metal – and I shivered. I was thinking keenly about the men with whom this blade had come into contact, and about the military surgeons who had had to use it, and about how it had somehow survived and made its way all the way from the fields of warfare and agony into my hands in the 21st century.

If there is anything that connects us to the past as directly as possible, it is physical matter. As Thomas Schlereth has noted: “Material culture is not only the most ancient of time’s shapes, it is also a tangible form of a past time persisting in present time.” Later he adds that “to the historical researcher, [artefacts] are here in his time; and yet they are also still there in another time – that is, in their time.”[1] We will never be able to travel back to a moment in history, but artifacts are able to travel forward and speak to the next generation. Well, perhaps “speak” is not the right word: I have always been fascinated and frustrated by the silent immensity that objects hold, by the stories they keep jealously hidden when their owners have long passed away or there is no one to tell them. But silent as they may be at times, artifacts do give us an immediate and mostly unmediated connection to the past that their virtual embodiments cannot match.

Of course, I still think, like many, that computer technology has made the documents and objects of the past so much more accessible – and in turn, seem so much more varied and abundant – than they have ever been. There will not, perhaps, be a great many who will get to examine that amputation kit as closely as I could last week; but they will be able to see and consider it using the collection’s website. And too, I’d add that in discussions about the virtual and the physical, it does not need to become an either/or debate. Physical objects along with their virtual counterparts enhance the study of history; both help one to come to grips with the past. As Anthony Grafton has put it, in the specific context of books and the ramifications of their increasing digitization, “these streams of data, rich as they are, will illuminate, rather than eliminate, books and prints and manuscripts that only the library can put in front of you.”[2] This is all true.

Nonetheless, I think it is good to be reminded, now and then, in an age where the line between the virtual and the physical seems to be blurring, where the virtual carries almost as much weight as – and sometimes even more than – the physical, that matter still does, well, matter.

_______________________________________
[1] Thomas J. Schlereth, “Material Culture and Cultural Research,” in Material Culture: A Research Guide, ed. Thomas J. Schlereth (Lawrence: University Press of Kansas, 1985), 9-10.

[2] Anthony Grafton, “Future Reading: Digitization and its Discontents,” New Yorker.com, November 5, 2007, http://www.newyorker.com/reporting/2007/11/05/071105fa_fact_grafton (accessed September 17, 2008).