Annotating Digital Documents for Asynchronous Collaboration

02-09-02

Annotations are a natural way to record comments and ideas in specific contexts within a document. When people read, they often underline important passages or write notes in the margin. While we typically think of annotating paper documents, systems that support annotating digital documents are becoming increasingly common. Digital annotations are easily shared among groups of people, making them valuable for a wide variety of tasks, including online discussion and providing feedback. This research explores three issues that arise when using annotations for asynchronous collaboration.

First, I present the results of using a prototype annotation system, WebAnn, to support online discussion in an educational setting. In a field study in a graduate class, students contributed twice as much content to the discussion using annotations as they did using a traditional bulletin board. Annotations also encouraged a different discussion style, one focused on specific points in the paper under discussion. The study results suggest valuable improvements to the annotation system and factors to consider when incorporating online discussion into a class.

Second, I examine how to provide appropriate notification mechanisms to support online discussion using annotations. After studying notifications in a large-scale commercial system and finding them lacking, I designed and deployed enhancements to the system. A field study of the new notifications found that my enhancements increased overall awareness of annotation activity on software specifications. The study also found three features to be particularly important: providing more information in notification messages, supporting multiple communication channels through which notifications can be received, and allowing customization of notification messages.

Third, I explore how to anchor annotations robustly, so that they behave as users expect when the documents they attach to evolve over time. I describe two studies designed to explore what users expect to happen to their annotations when a document changes. The studies suggest that users focus on how well unique words in the annotated text are tracked across successive versions of the document. Based on this observation, I designed the Keyword Anchoring algorithm, which locates an appropriate new position for an annotation using unique words in the text the user annotated.
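The idea of anchoring by unique words can be illustrated with a small sketch. This is not the published Keyword Anchoring algorithm; the function names, the rarity-based keyword selection, and the sliding-window scoring are all my own assumptions, shown only to make the concept concrete:

```python
# Illustrative sketch of keyword-based annotation anchoring.
# Assumption: "unique" words are those that occur rarely in the
# document as a whole, and the best new anchor position is the
# window of text containing the most of those words.
from collections import Counter

def unique_keywords(anchor_text, document, max_keywords=3):
    """Pick the rarest words of the annotated text, judged by
    their frequency in the whole document."""
    doc_freq = Counter(document.lower().split())
    words = anchor_text.lower().split()
    return sorted(set(words), key=lambda w: doc_freq[w])[:max_keywords]

def reanchor(anchor_text, old_doc, new_doc, window=10):
    """Slide a window over the new version of the document and
    return the token offset whose window contains the most of the
    anchor's unique keywords, or None if none of them survive."""
    keywords = set(unique_keywords(anchor_text, old_doc))
    tokens = new_doc.lower().split()
    best_pos, best_score = None, 0
    for i in range(max(1, len(tokens) - window + 1)):
        score = sum(1 for w in tokens[i:i + window] if w in keywords)
        if score > best_score:
            best_pos, best_score = i, score
    return best_pos
```

A real system would also need to handle ties, partial matches, and annotations whose keywords are split or reworded in the new version; the sketch simply reports the first best-scoring window.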