The year is 1959, and as a graduate student in history, I’m currently studying the approaches that are emerging from the Annales School in France. My doctoral supervisor is familiar with the topic and uses similar approaches in her own work. As we discuss the themes and recent publications, she asks me to write regular papers that reflect on what I’ve learned and how professional history is changing. Throughout the term, I write and submit the papers, which receive comments and are returned to me.
After one particularly engaging discussion, I decide that I’d like to send copies of my paper to a family member and a friend who might enjoy the topic. Thankfully, I’m attentive to technological innovations, and note that Haloid Xerox recently unveiled the 914, its new plain-paper copier, and that my university managed to acquire one from the first production order. I visit the university printing office, pay a processing fee, and hand over my typewritten paper. In a short while, I have two copies, which I send to Canada and England.
A week later, I speak on the telephone with my family member about the content of my paper, and note a few things to remember in my future writing. Two weeks after that conversation, I receive a letter from my friend in England, who has carefully reviewed my paper and written a response. Her remarks are similar to my relative’s, which I mention in my next phone call. By the time I can offer a reply to her letter, nearly a month has passed since I first copied the paper.
Now we step out of that scenario. Today, when I finish this paper, I’ll copy and paste it into a post on my website, powered by the WordPress blogging platform. I’ll add hypertext links to replace conventional footnotes, assign a title, add tags, categorize the post, and publish. Within minutes, I’ll tweet about the post on Twitter and post an update on Facebook. Friends, family, and other scholars will be able to read, print, copy, and comment on this paper before I have a chance to move on to the next task. In some cases, my readers might respond to one another and carry on digitally a conversation begun in an office. Most importantly, those interactions require only an internet connection, a web-capable device, and a desire to participate.
In the final session of my digital history reading series, I read three scholars primarily concerned with describing and analyzing the social changes resulting from advances in technology. Yochai Benkler uses the term “networked information economy” when describing the impact of information technology on cultures, economies, and societies. Cass Sunstein focuses on the aggregation of knowledge from many minds into deliberative (or predictive) groups that work to make effective, efficient decisions or judgements. In his two complementary books, Clay Shirky examines the changes to organizations, consumption, and collaboration made possible in the digital age. Each of these authors agrees on some basic principles:
- Technology lowers barriers to participation and improves access to information
- People will participate for numerous and unpredictable reasons
Each author traces those ideas along their particular topics to identify the changes that are occurring throughout society. Benkler suggests that culture becomes more transparent and malleable when more people participate in its production. Sunstein argues that groups make better, more accurate decisions or judgements when they can access the knowledge held by each member, even if that information is collected only through markets or betting. Shirky identifies a significant shift away from the media consumer culture of postwar decades toward participatory and collaborative media interactions. Some of their conclusions seem apparent to those who study digital technology in any field, but each study tells a slightly different part of the larger narrative that documents the shifting culture of the digital age.
For many professionals, including historians, these observations carry ambiguous implications for the future. In some cases, scarcity of information and limited access formed the basis of practice. Historians represent the past by writing histories that refer to material most people cannot access. The skills and expert knowledge that historians acquire in university are often focused on leveraging that access to fill a gap in the historical narrative. As digital repositories and virtual libraries create free, public access to documents and artefacts, historians face an internal challenge: how should we adapt? I don’t mean to suggest that historians’ only value is uncovering and representing the past, but a large part of the work, training, and valuation of historians centres on that idea, which is slipping away. When people can access, critique, and interpret documents and artefacts for themselves (albeit without the expert eye of a historian), how should historians justify their careers and work?
Perhaps a slight but significant change in the conceptual framework of history will provide a hinge on which to pivot the profession. The scarcity and inaccessibility of information in previous decades meant that historians were justified primarily by what they did. They alone wrote scholarly, peer-reviewed, and (presumably) accurate historical narratives. But everything has changed. We live in a world where anyone can access the sources, write a narrative, publish their work, and solicit comments from other people. The results might not be as scholarly or accurate, but what historians do can now be done by anyone. Put another way, anyone can remove a tooth, but you would only trust a dentist with yours. So what would it mean if historians focused on how they do history?
Refocusing on how we do history will require that we think carefully about the ways in which historians differ from anyone else who might write historical narratives. In the first post in this series, I suggested that interdisciplinary collaboration might be made easier through “a shared system of values.” Although many possible examples exist, most professional disciplines in the academy value “inquiry, critical engagement, peer-review, and original contribution.” If we use those and other values when describing how history should be written, we need not fear the open network of information available in our digital age.
We must not, however, seek a policy of exclusion. In the deliberative processes that Sunstein describes, experts often play an important role, but he also acknowledges that experts are sometimes wrong. Entrenching against the tide and reiterating our authority is certainly an option, but it would likely fail as people recognized that we need them much more than they need us. Historians provide a service to humanity. In the digital age, humanity no longer needs explorers, but interpreters and guides. Lewis and Clark would be redundant on the interstate highways, but GPS is ubiquitous; all because the barriers are down. Historians cannot stand behind a fence and lob histories over into a waiting populace. We can, however, stand next to a historical point of interest and help people understand what they see, hear, smell, touch, and taste. If we believe what Shirky, Sunstein, and Benkler argue about networks, deliberation, and collaboration, we might even learn something from their experiences.
In my last essay, I discussed the idea that history represents but cannot replicate the past. Each history of a time and place will catch different sights, sounds, odours, textures, and flavours. If historians can accept that fact about their work, how could we deny anyone the opportunity to view and interpret the past? Each perspective offers a chance to see something previously unnoticed. Each step on a historical site might uncover an insight or question not previously stated. Human curiosity and the desire to participate are raw but potent, especially if a historian stands by to offer methods of inquiry, critical review, and ways to contribute. Anyone can remove a tooth, but they can learn to do it well when aided by an instructive dentist.