Robust Annotation Positioning in Digital Documents

A.J. Bernheim Brush, David Bargeron, Anoop Gupta, and JJ Cadiz

September 22, 2000

Technical Report
MSR-TR-2000-95

Microsoft Research
Microsoft Corporation
One Microsoft Way
Redmond, WA 98052

Robust Annotation Positioning in Digital Documents

A.J. Bernheim Brush, David Bargeron, Anoop Gupta, and JJ Cadiz
Collaboration and Multimedia Systems Group
Microsoft Research
One Microsoft Way
Redmond, WA 98052
ajb@cs.washington.edu, {davemb, anoop, jjcadiz}@microsoft.com

ABSTRACT
Increasingly, documents exist primarily in digital form. System designers have recently focused on making it easier to read digital documents, with annotation as an important new feature. But supporting annotation well is difficult because digital documents are frequently modified, making it challenging to correctly reposition annotations in modified versions. Few systems have addressed this issue, and even fewer have approached the problem from the user's point of view. This paper reports the results of two studies examining user expectations for robust annotation positioning in modified documents. We explore how users react to lost annotations, the relationship between types of document modifications and user expectations, and whether users pay attention to the text surrounding their annotations. Our results could contribute substantially to effective digital document annotation systems.

Keywords
Annotation, robust, digital, documents, annotation system design.

INTRODUCTION
Four common activities surrounding documents are reading, annotating, collaborating, and authoring. Until recently, computer software vendors have primarily focused on the authoring task, with products like Microsoft Word.
The industry is now realizing that, in fact, the most pervasive activity around documents is not authoring but reading, followed closely by annotating, then collaborating, with authoring last. The majority of people read and annotate daily, but do not create new documents. With this shift in perspective, there is an increased focus on software primarily targeting reading and annotating [10][11]. The reading-centric products are aware of the importance of pagination over scrolling, of side margins, and of the relationships between font size, line spacing, and line width. The annotation capabilities provided to date are, however, more primitive and currently being refined. Cadiz et al. [3] report on a recent study in which they observed the use of electronic annotation by roughly 450 users over a 10-month period. While there were many observed benefits, a key complaint was the orphaning of annotations. That is, when the online documents were changed, the annotations lost the link to their proper position within the document and were presented at the bottom of the document. The problem of orphaning is unique to annotations on digital/online documents, as paper-based documents do not change underneath the annotator. As more documents appear online and as other traditionally paper-based document processes become increasingly digital (such as editing and revision), robust annotations that remain associated with the correct portion of the document across modifications will become crucial. But correctly positioning annotations in a revised document is a difficult problem. Some annotation systems work around the problem by limiting where an annotation can be placed [4][11]; others silently orphan or drop annotations when documents change [5].
Researchers have begun to explore algorithms for robustly saving an annotation's position and finding it in a modified version of the document [6][15]. However, we believe focusing solely on algorithmic approaches to this problem neglects a crucial step: no one has asked users what they expect an annotation system to do when a document changes. This paper's primary contribution is to take that step by reporting the results of two studies. Participants in the studies made annotations, transferred them to modified documents manually, and rated how well a simple algorithm positioned annotations in modified documents. Our belief was that observing the thought processes people use to place annotations in a modified document would help us create a robust positioning algorithm that does what people expect. Some of the results were surprising. It was unexpectedly difficult for study participants to work with annotations that they had not made. Even when part of the original text associated with an annotation was found, in some cases it seemed participants would have preferred the system to orphan the annotation. Also, participants appeared to pay little attention to the text surrounding their annotations. In the next section we review related work. Then Section 3 lays out a framework for annotation position information and types of document modifications. Sections 4 and 5 describe the methodology of our two studies and their results. Section 6 discusses how we can use these results to construct better robust positioning algorithms for annotations.

RELATED WORK
Effectively positioning annotations in a digital document is a non-trivial problem. The exact document text related to an annotation is often ambiguous. For instance, Marshall [9] suggests that people frequently place their annotations carelessly.
The underlines and highlights they create (on paper, in this case) often follow document structure or typographical characteristics rather than content. The positioning problem is even more difficult if the underlying document can be modified. Users are not forgiving when a system fails to correctly position their annotations in a modified document [3]. Thus, previous systems have taken a wide variety of approaches toward solving the positioning problem, as outlined below.

Annotating Frozen Documents
Many systems simply assume that annotated digital documents will never change. Adobe Acrobat Reader [1], Aladdin Ghostview [2], and Microsoft eBook Reader [10] all take this approach. Other systems have augmented traditional annotation of paper documents (which don't change) with computer support [12][19]. In both types of systems, annotations are typically positioned using very simple means, such as character offsets or a page number plus an (x,y) position. The underlying document is never modified, so annotations never have to be repositioned. Other systems do not explicitly require documents to remain unchanged, but work best when there are no modifications. In these systems annotations are created on any web page, stored separately on a central server, and visible to everyone with access to the server. Annotations are typically positioned by calculating a signature from the content of the page to which the annotation belongs. E-Quill [5], Third Voice [18], and Microsoft Office Web Discussions [11] are commercial systems that have taken this approach; public web-scale architectures such as OSF [17] and NCSA [7] do as well.
In many important scenarios such as the Web, however, it is unrealistic to assume that documents will never change. If a document does change, these systems fail to properly position some annotations, and the annotations either silently disappear or are displayed in a separate window as orphans. Not surprisingly, this problem has been found to be particularly debilitating. In a study of the large-scale use of Microsoft Office 2000 Web Discussions, lost annotations were cited as the primary reason people stopped using the system [3]. Our work is aimed at accommodating documents that may get modified.

Annotating Predefined Positions
Some systems attempt to compensate for potential modifications in web pages by only allowing users to annotate predefined positions. CoNotes [4] requires inserting special HTML-like markup tags into a document before it can be annotated. Microsoft Office Web Discussions [11] only allows users to attach annotations to a small selection of HTML tags. By limiting the places where annotations can be placed, these systems can better control how the annotations are positioned when the underlying page gets modified. Our goal is to allow users to position their annotations anywhere in a digital document.

More Complex Positioning Algorithms
A number of systems implement more sophisticated positioning algorithms that make very few assumptions about the documents. Annotator [13], ComMentor [16], Webvise [6], and Robust Locations [15], part of Multivalent Annotations [14], are examples of systems that take this approach. These systems allow annotations to be positioned anywhere within a web page. They all store a combination of annotated text and surrounding text so that the annotation may be repositioned later.
ComMentor stores key words that attempt to uniquely identify annotated text. Annotator calculates a hash signature from the annotated text. Webvise stores a "locSpec" for each annotation that includes a bookmark or HTML target name, the annotated text, surrounding text, and a character count of the start position. Robust Locations stores the position of the annotation in the document tree as well as surrounding text. The annotations created by these systems are robust to varying degrees. Each system can fail to correctly position an annotation in a modified document and orphan it. The systems have varying strategies for presenting orphans to the user, from separate popup windows [15] to placing them at the end of the document [16]. While we build on the work of these systems, taking a user-centric approach to the problem of robustly positioning annotations will help us determine the appropriate annotation position information to store and how to design a positioning algorithm that meets users' expectations.

FRAMEWORK
Approaching the annotation positioning problem requires understanding two key components: how digital annotations work, and how documents may be modified.

Annotation Definitions
An annotation is a marking made on a document at a particular place. Each digital annotation is composed of two items: some content (for example, a user comment or highlighter ink) and an anchor (the information used to position the annotation in the document). Marshall [8] has classified paper-based annotations into four groups based on whether the annotation content is explicit to another reader (e.g., a scribbled note) or implicit (e.g., yellow highlighter ink implying importance) and whether the annotation's anchor is a margin anchor (e.g., asterisks, or a note scribbled to the side of a paragraph with no other marking) or a range anchor (e.g., highlighted text or a circled word).
Figure 1 illustrates the two anchor types. The highlight annotation has a range anchor and implicit content, and the asterisk annotation has a margin anchor and explicit content.

Robust Anchor Representation
The content and anchor information for digital annotations is often stored separately from the annotated document. This strategy allows people to annotate documents even if they don't have permission to modify them. However, it also requires high-quality anchor information: without a good anchor, a system can't position annotations correctly in a document for display to users. To ensure correct annotation positioning even when the document changes, a system needs to use robust anchors. Robust anchors could potentially use two types of information to identify an annotation's location:

Anchor text information: e.g., the text under the highlight.
Surrounding context information: text in the document near the annotation, but not explicitly selected by the user (see Figure 1).

One goal of our studies was to determine the relative value of both types of information to users when trying to position annotations in a modified document.

Anchor Text Information
The key role of anchor text information is to uniquely and efficiently identify the annotation's position in a document. As discussed earlier, numerous strategies exist to address this problem: storing simple character offsets, keywords, or the entire text string selected by the user. These methods only work when a user explicitly marks text. Margin annotations don't mark any text explicitly. For example, does the asterisk in Figure 1 relate to just the last few words, the last sentence, or the complete paragraph?

Surrounding Context
The surrounding context is the text that is near the annotation, but not explicitly selected by a user. For example, the underlined text in Figure 1 can be considered part of the surrounding context for the highlight annotation.
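As a concrete illustration, the two kinds of anchor information could be captured in a record like the following. This is a hypothetical Python sketch; the type and field names are ours, not drawn from any of the systems cited.

```python
from dataclasses import dataclass

@dataclass
class RobustAnchor:
    # Anchor text information: text explicitly selected by the user
    # (empty for a pure margin annotation, which selects no text).
    anchor_text: str
    # Surrounding context information: nearby text not selected by the user.
    context_before: str
    context_after: str

def make_anchor(doc: str, start: int, end: int,
                context_chars: int = 40) -> RobustAnchor:
    """Capture a robust anchor for the selection doc[start:end]."""
    return RobustAnchor(
        anchor_text=doc[start:end],
        context_before=doc[max(0, start - context_chars):start],
        context_after=doc[end:end + context_chars],
    )
```

A repositioning algorithm can then weigh the selected text and the surrounding context separately when searching a modified document.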
More generally, we can think of the surrounding paragraph, subsection, section, and so on as part of the surrounding context. Meta-information, such as HTML markup tags, can also be used as part of the surrounding context. Surrounding context is important for several reasons. First, it is the only way to identify where margin annotations should be positioned. Second, surrounding context can be used, as in Robust Locations [15], to verify that the correct position for the annotation anchor has been located. Third, the range of text specified by the reader may not be carefully chosen [9]. For digital annotations, this may mean that people expect annotations to remain intact if the surrounding context remains, even if large changes occur in the anchor text.

Document Modifications
Documents may be modified for different reasons and in a variety of ways. It is important to differentiate between modifications made to address annotations and modifications made independently of annotations. A modification may be made in response to an annotation. For example, a sentence may be highlighted with "please reword" written in the margin next to it. If the author rewords the sentence, it is difficult to know whether a system should try to position and show the annotation in the modified document. We do not focus on robust positioning of these editing annotations in this paper; a solution based on a "resolve" button is discussed in [3]. Modifications may also be made independently of any annotation. For example, an author may generate a new draft of a document while a colleague marks up a previous draft. This is the case we focus on here. Our modification classification scheme is shown in Table 1. A piece of text can experience three main types of modification: deletes, rewords, and moves. Note that a single piece of text may undergo several of these modifications at once.
Although delete and reword modifications are easy to see, move modifications are more complicated. For example, if the paragraph prior to the annotation is deleted, the surrounding context of the annotation changes without any change to the actual text that the annotation is anchored to.

Study Focus
We chose to focus on a limited number of common annotation and modification types in this paper. First, because the majority of digital annotations use range anchors, not margin anchors (it is easier to highlight text with a mouse than it is to draw an asterisk in a margin), we focused on annotations with range anchors. Second, we focused on annotations made during active reading of text documents, similar to those studied by Marshall [8], instead of examining editing annotations. Annotations made during active reading are often meant to persist for future reference; thus they are precisely the type of annotation that must survive document modifications.

PILOT STUDY: ANNOTATIONS ON PAPER
To examine user expectations for robust annotation positioning, we conducted two user studies. The main goal of the pilot study was to explore what users perceive as annotation context. We did this by isolating the task from user interface design concerns and having participants perform the task for which we were trying to design an algorithm: we had participants transfer annotations from an original document to a modified version (on paper). Our hypothesis was that observing the thought processes people use to identify the context of an annotation and place it in a modified document would help us create a software algorithm that does what people expect.

Experimental Method
We recruited 8 participants who had at least basic computer knowledge. All were either college educated or college students, and all read for at least 30 minutes on average every day. Participants received a gratuity. Participants performed three main tasks.
First, they looked at a pre-annotated document and told us what they thought the context for each annotation was. The document was a news article with a variety of types of annotations on it (a selection of highlights, underlines, margin notes, and symbols created by four coworkers). Second, we had participants transfer the annotations from the original document to a version modified by a technical writer. Third, we had participants compare the original annotated document with a modified version in which a computer had positioned the annotations. The annotations on the modified version were actually placed there by a person using an algorithm similar to the method reported in [15]. Participants rated how well the computer did using a 7-point Likert scale.

Lessons Learned
Instead of obtaining data about the cognitive processes people use to transfer annotations, we learned that making the context of annotations explicit and then transferring them is a difficult task. Problems seemed to stem from the fact that people were asked to work with annotations that they did not make. We consciously designed the task this way so that we could control what type of modifications each annotation was subjected to in the altered version of the document. However, if a participant did not understand (or agree with) an annotation, it negatively affected their ability to specify its context and to transfer it. One participant quipped, "Again we have another star here. That's a horrible annotation right there." Another said, "I don't see how it [another annotation] applies, but I guess it does." One participant even refused to transfer annotations that were "someone's opinion that I didn't agree with. Why should I promote their cause?" Rating the computer's transfer of the annotations was also difficult because participants were not working with annotations that they had made.
Instead of rating the new position of the annotation in the modified version, several participants rated how valuable they thought the annotation was. Also, because the task was done on paper (where it was clear that a person had marked up the document), people had a difficult time understanding that we were pretending a computer had positioned the annotations.

SECOND STUDY: DIGITAL ANNOTATIONS
Based on our experience from the pilot study, we conducted a second study where participants created their own annotations on a digital document using software we designed. We narrowed our focus to examine user ratings of annotation positioning done by a computer. Our primary goal for this study was to gauge users' reactions to a relatively simple repositioning algorithm, especially when it failed.

Annotation Software
For this study, we extended Microsoft Internet Explorer, as shown in Figure 2, to allow people to highlight and make notes on web pages. A user makes an annotation by using the mouse to select a portion of text on a web page and then left-clicking the selection. A menu pops up from which the user can choose to highlight or attach a note to the selected text. Highlighted text is displayed with a yellow background, and text with a note attached is displayed with a blue one. A list of all annotations for the web page is shown in the annotation index window on the left. This index also displays the contents of any note annotations. All annotations are automatically numbered. Participants could delete annotations by left-clicking on an existing annotation and selecting delete from the menu.

Annotation Positioning Algorithm
We included a simple algorithm to reposition annotations if an annotated document was modified. The algorithm was similar to the context method reported in [15]. The algorithm saved the text selected by the participant as the anchor and then used text matching to find the anchor position in the modified version.
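In outline, this matching step, together with the fallback trimming behavior described next (alternately cutting words from the front and back of the anchor until a match is found or the candidate falls below 15 characters), might look like the following. This is our own illustrative Python sketch, not the study's actual implementation; all names are ours.

```python
from typing import Optional

def find_anchor(anchor: str, doc: str, min_len: int = 15) -> Optional[int]:
    """Locate anchor text in a modified document. Try an exact match
    first; on failure, alternately trim one word from the front and
    the back until a partial match is found or the candidate drops
    below min_len characters, in which case the annotation is
    orphaned (None is returned)."""
    words = anchor.split()
    trim_front = True
    while words:
        candidate = " ".join(words)
        if len(candidate) < min_len:
            return None                # orphan: candidate too short
        pos = doc.find(candidate)
        if pos != -1:
            return pos                 # full or partial match found
        if trim_front:
            words = words[1:]          # drop a word from the front
        else:
            words = words[:-1]         # drop a word from the back
        trim_front = not trim_front
    return None
```

Because the trimming alternates between the two ends, the center words of the anchor text survive the longest, which is why such a search effectively weights them most heavily.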
If all the original text was not found, the algorithm alternated cutting words off the front and back of the anchor text, looking for the shorter text in the modified document until it found a partial match or until the length of the anchor fell below 15 characters. If the algorithm could not find a location for the annotation, it orphaned the annotation. Orphaned annotations were displayed at the top of the annotation index (see Figure 2). This algorithm is fairly simple: it does not take into account surrounding context or search for the anchor text in a more sophisticated manner, and it effectively weights the center words of the anchor text more heavily than the words toward the beginning and the end. We decided to use this algorithm to gather observations of user expectations before developing a more complicated one. We expected the algorithm to fail often, so that we would be alerted to scenarios where participants were most unhappy with the algorithm's performance.

Experimental Method
For this study, 12 participants were recruited in the same manner as in the first study. Participants were first given a brief training task to familiarize themselves with the system, and then given the task of annotating a document so that it could be skimmed quickly by a busy executive. The document was a general-interest news article from the web. Next, participants were told that an updated version of the document was available, but that rather than repeating the task of annotating the document, they would have the computer transfer their annotations from the old document to the new one. Participants then examined each annotation and rated its position in the new document on a 7-point scale where 7 was "perfect," 4 was "ok," and 1 was "terrible." In this study, because participants made their own annotations, we needed to create an updated version of the document before the study, with modifications that would affect participants' annotations.
To do this, we had a few pilot study participants annotate the document (on paper). Then we made changes to the original document in places where people tended to make annotations. A second updated version was created by a colleague unfamiliar with the annotation positioning algorithm. If participants quickly finished the rating task using the first updated version, we had them repeat the task for the second updated version.

Results
The main purpose of this study was to examine participant satisfaction with the algorithm's attempts to reposition annotations in the updated document. The 12 participants made a total of 216 annotations and then rated their satisfaction with how each annotation was positioned in the first updated version. Half the participants also rated the positions of their annotations in the second updated version. A total of 302 position satisfaction ratings were collected. We present participant position satisfaction ratings in the following sections by breaking the set of 302 ratings into three logical groups based on the changes made to an annotation's anchor text:

Same: Annotations anchored to text that did not move or change.
Move: Annotations anchored to text that was moved from one portion of the document to another, but that otherwise did not change.
Complex: Annotations anchored to text that was changed and possibly moved.

We expected high satisfaction ratings for the transfer of annotations in the Same group because our algorithm finds all such annotations. For annotations in the Move group we still expected fairly high ratings, since the algorithm also finds these annotations. However, we believed that if the anchor text moved significantly in the document, this would change its surrounding context and perhaps render it irrelevant. In this case, participants might prefer the annotation to be orphaned. For annotations in the Complex group we expected lower scores due to the simplicity of the algorithm.
We expected instances where participants would be unsatisfied with how much of an annotation's anchor text the algorithm found, or with the fact that an annotation had been orphaned. We also believed that participants would always rate annotations that were found higher than annotations that were orphaned, except when the orphan was caused by deletion of the entire anchor text.

Same: When Anchor Text Does Not Change
Although our algorithm is simple, it is guaranteed to find annotations attached to unique text that does not move or change. 47 of the 302 position ratings fell into this category. As we expected, the median participant rating for these annotation positions was a perfect 7.0. When the text doesn't move or change and the system finds the entire annotation anchor text in the new document, participants are happy.

Move: When Anchor Text Moves
121 of the position ratings were for annotations attached to anchor text that was moved in the updated document, but not changed in any other way. We focused on move modifications that were noticeable to a human reader. For example, a paragraph might have been moved from one page to another. 100% of annotations attached to text that moved, but did not change, were found in the updated document. This was due to our algorithm's use of simple text matching to find an annotation's anchor text and the fact that participants attached their annotations to unique sections of text. The median participant rating for these annotation positions was 7.0. The high ratings given for these annotation positions surprised us somewhat. We expected that if the text an annotation was attached to moved significantly, there would be times when an annotation would lose relevance and need to be orphaned. However, the data indicate that this is not the case. Thus, perhaps the surrounding context of an annotation is of lesser importance when considering the factors that contribute to keeping participants satisfied with automated annotation positioning.
It would be interesting to explore whether users feel the same way about the surrounding context for editing and margin annotations.

Complex: When Anchor Text is Modified
134 of the position ratings were for annotations attached to text that was changed in some way in the updated document. Of these annotations, our algorithm successfully transferred 71 and orphaned 63. Note that a piece of text may have been both changed and moved, but since the data in the previous section indicate that ratings are independent of moves, we focus primarily on how the anchor text changed. To analyze this set of annotations, we classified the changes that were made to an annotation's anchor text. Sometimes just one word was changed, and sometimes the entire sentence was rewritten. Changes were coded using the six delete and reword categories outlined in Table 1, and these encodings were used to compute a modification score for each annotation. Minor rewords and minor deletes were given one point, and medium rewords and medium deletes were given two points. Using this scheme, higher scores indicated more drastic changes, with a highest possible combined modification score of 3. Total deletes were treated as a separate category and automatically given a score of 4. Total rewords were eliminated from the analyses because only one such case occurred. The reliability of these classifications was verified by having a colleague not involved with the research code a representative sample of the anchor text changes. Inter-rater reliability for the modification score was high (κ = .90).

When Annotations are Orphaned
Table 2 shows the median position ratings for annotations that were orphaned in cases where the text changed. The overall median score for this set of annotations was 5.0. As we expected, the table shows that participants gave the lowest ratings when little modification occurred to the text and the annotation was not found.
In fact, participant ratings were significantly correlated at .72 (p < .001) with modification score. Thus, ratings increased as more modifications occurred, to the point where participants gave the highest ratings to orphaned annotations when all of the text the annotation had been attached to was deleted. Comments that participants made while rating orphaned annotations also support the hypothesis that as the amount of text change increases, people are more satisfied when the annotation is not found. For one annotation, a participant told us that "the document changed around enough and the keywords left out of the second article, I could see it might not find that." Of another annotation, a participant observed that the modifications "redid [it] entirely ... makes sense they [the algorithm] didn't find that one."

When Anchor Text Changes and Annotations are Found
Table 3 shows the median position ratings for annotations that were found in cases where the anchor text changed. The overall median score for this set of annotations was 4.0. Note that a successful annotation transfer includes cases where only part of an annotation could be transferred. For example, suppose a person made the following highlight annotation:

The quick brown fox jumped over the lazy dog.

Below is an example of modified text and the partial anchor text our algorithm would have found:

The quick fox jumped away from the dog.

To take partially found annotations into account, we also examined this set of annotations by looking at what percentage of the annotation anchor text was found in the modified document. These percentages are listed in the columns of Table 3. The data in Table 3 suggest two trends. First, not surprisingly, the greater the percentage of the annotation anchor text found, the more satisfied people were (bottom row of Table 3, read left to right). The percentage of annotation anchor text found was significantly positively correlated at .65 (p < .001) with participant rating.
Second, and somewhat counterintuitively, the more drastic the modifications to the anchor text, the less satisfied people were when the annotation anchor was found (right column of Table 3, read top to bottom). Modification score was significantly negatively correlated at -.34 (p < .003) with participant rating. This was unexpected. We thought that participants would be more impressed with the system when it was able to find annotations even after significant changes, but this was not the case. Finally, somewhat surprising were the participants' median ratings of 3 for both found and orphaned annotations with modification scores of 2 and 3 (see Tables 2 and 3). We had expected found annotations to always be rated higher than orphans not caused by a total delete of the anchor text.

DISCUSSION
The results from our studies provide valuable insight for designers of annotation systems.

Surrounding Context is Less Important
As noted previously, robust anchors can be created by storing an annotation's surrounding context and anchor text information. We were surprised when our studies indicated that users may not consider surrounding context very important for annotations with range anchors (even though it may be crucial for annotations with margin anchors). We observed rather casual text selection, where annotation boundaries were influenced by document formatting (for example, ends of lines), similar to Marshall's observation for annotations on paper [9]. We thought this might cause participants to expect the annotation transfer algorithm to perform a more sophisticated search for the correct text when it was deleted, but the data do not support this. Participants gave very high position ratings for annotations attached to text that was significantly moved, and for annotations that were orphaned because the original text was deleted. This does not necessarily mean that robust positioning algorithms should not save surrounding context.
Rather, users may not consider it very important, so it should perhaps be weighted less heavily in algorithms that do employ it. Future research should examine whether this finding was due to our focus on active reading annotations rather than other types of annotations, such as editing.

Focus on Keywords

When examining the particular cases where participant ratings were low, we found that participants often expected the system to do a better job locating key words or phrases. Comments included: "The key words are there, it should have been able to somehow connect that sentence [in the modified version] with the original"; "Should have gotten that one, at least the quote"; "Should have at least highlighted the name"; "Doesn't pick up a change in wording that means essentially the same thing." Thus, when designing robust positioning algorithms, it may be helpful to pay special attention to unique or key words in the anchor text, as the ComMentor [16] system does. Participants also appear to consider names and quotations particularly important. A simple thesaurus or grammar parser may additionally be useful for recognizing rewordings that do not change the semantics of a sentence.

Orphan Tenuous Annotations

Based on Tables 2 and 3, two trends seem to emerge. First, if an annotation is found, users initially assign the highest rating and then move down the satisfaction scale based on how much of the annotation anchor text the algorithm found and how many modifications occurred. For orphaned annotations the process works in reverse: participants start with the lowest rating and then move up the scale as more modifications are noticed, or when they realize the entire anchor text has been deleted.
These trends suggest that there may be a point where, even though an algorithm may be able to find a highly likely location for an annotation in a modified document, the participant would be more satisfied if the annotation were orphaned. Further testing of this hypothesis is a good area for future research.

Include User Intervention

If systems do choose to orphan some annotations even when they have a relatively good guess as to where the annotations should be positioned, it may be helpful to provide users with a "best guess" feature that shows them where orphaned annotations might be located. This feature may also be helpful in situations where users need to ensure all annotations are moved to a modified version of the document. Some of the system's best guesses may not be correct, but they may provide enough information for a user to easily reattach orphaned annotations.

CONCLUDING REMARKS

The primary contribution of this paper has been to explore what users expect when they make annotations on a digital document that is subsequently modified. The paper also presents a framework for building robust annotation systems that requires us to ascertain the relative importance of surrounding text versus anchor text, as well as which kinds of anchor text information are more important to users than others. For the types of annotations studied, our results suggest that participants paid little attention to the surrounding context of an annotation, and algorithms may want to give the surrounding context relatively little weight when determining an annotation's position. As for anchor text information, participants' comments stressed the importance of key words, proper names, and quotations. We also found that in certain cases, even when part of an annotation's anchor text is found, users may prefer that the positioning algorithm not place it in the modified document. The detailed data we collected are useful for determining potential thresholds for orphaning annotations.
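One way such a threshold could be operationalized is to combine the fraction of anchor text found, the fraction of key words found, and the modification score into a single confidence value, orphaning the annotation when confidence falls below a cutoff. The weights and thresholds in this sketch are purely illustrative assumptions, not values derived from our data:

```python
def should_orphan(match_fraction: float, modification_score: int,
                  keyword_fraction: float,
                  min_confidence: float = 0.5) -> bool:
    """Decide whether to orphan an annotation rather than place a weak
    guess. All weights and the cutoff are illustrative assumptions.

    match_fraction:     fraction of anchor-text words found (0..1)
    modification_score: 1 (minor) to 3 (total reword), as in Table 1
    keyword_fraction:   fraction of rare/key anchor words found (0..1)
    """
    # Heavier modification lowers confidence, echoing the negative
    # correlation between modification score and satisfaction.
    modification_penalty = (modification_score - 1) * 0.15
    # Key words (names, quotations, rare terms) are weighted separately,
    # echoing participants' focus on keywords.
    confidence = (0.6 * match_fraction
                  + 0.4 * keyword_fraction
                  - modification_penalty)
    return confidence < min_confidence
```

A "best guess" interface could still surface the candidate location for annotations this rule orphans, letting the user confirm or reattach them.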
While our results have revealed valuable information about user expectations and will help us design more robust annotation positioning algorithms, much work remains. Future studies should explore how our results apply to other types of annotations (such as editing annotations and margin annotations) and to other types of documents.

ACKNOWLEDGMENTS

We thank Molly Brown, Mike Brush, Duncan Davenport, and Scott LeeTiernan for their assistance.

REFERENCES

1. Adobe Acrobat. http://www.adobe.com/products/acrobat/main.html
2. Aladdin Ghostview. http://www.cs.wisc.edu/~ghost/
3. Cadiz, J., Gupta, A., Grudin, J. Using Web Annotations for Asynchronous Collaboration Around Documents. To appear in Proceedings of CSCW 2000 (Philadelphia, PA, Dec. 2000).
4. Davis, J. and Huttenlocher, D. CoNote System Overview (1995). Available at http://www.cs.cornell.edu/home/dph/annotation/annotations.html
5. E-Quill. http://www.e-quill.com/
6. Grønbæk, K., Sloth, L., Ørbæk, P. Webvise: Browser and Proxy Support for Open Hypermedia Structuring Mechanisms on the WWW. Proceedings of the Eighth International World Wide Web Conference (Toronto, May 1999). Available at http://www8.org/w8-papers/3a-search-query/webvise/webvise.html
7. Laliberte, D., and Braverman, A. A Protocol for Scalable Group and Public Annotations. 1997 NCSA Technical Proposal. Available at http://union.ncsa.uiuc.edu/~liberte/www/scalable-annotations.html
8. Marshall, C.C. Annotation: from paper books to the digital library. Proceedings of Digital Libraries '97 (Philadelphia, PA, July 1997), ACM Press, 131-140.
9. Marshall, C.C. Toward an ecology of hypertext annotation. Proceedings of HyperText '98 (Pittsburgh, PA, June 1998), ACM Press, 40-48.
10. Microsoft eBook Reader. http://approjects.co.za/?big=reader/
11. Microsoft Office 2000 Web Discussions. http://officeupdate.microsoft.com/2000/focus/articles/wWebDiscussions.htm
12. NotePals. http://guir.berkeley.edu/projects/notepals/
13. Ovsiannikov, I., Arbib, M., McNeill, T. Annotation Technology. Int. J. Human Computer Studies 50 (1999), 329-362.
14. Phelps, T., Wilensky, R. Multivalent Annotations. Proceedings of the First European Conference on Research and Advanced Technology for Digital Libraries (Pisa, Italy, Sept. 1997). Available at http://www.cs.berkeley.edu/~phelps/papers/edl97-abstract.html
15. Phelps, T., Wilensky, R. Robust Intra-document Locations. Proceedings of the 9th World Wide Web Conference (Amsterdam, May 2000). Available at http://www.cs.berkeley.edu/~phelps/Robust/Locations/RobustLocations.html
16. Roscheisen, M., Mogensen, C., Winograd, T. Shared Web Annotations as a Platform for Third-Party Value-Added Information Providers: Architecture, Protocols, and Usage Examples. Technical Report CSDTR/DLTR (1997), Stanford University. Available at http://www-diglib.stanford.edu/rmr/TR/TR.html
17. Schickler, M.A., Mazer, M.S., and Brooks, C. Pan-Browser Support for Annotations and Other Meta-Information on the World Wide Web. Proceedings of the Fifth International World Wide Web Conference (Paris, May 1996). Available at http://www5conf.inria.fr/fich_html/papers/P15/Overview.html
18. ThirdVoice web annotations. http://www.thirdvoice.com/
19. Wilcox, L.D., Schilit, B.N., and Sawhney, N. Dynomite: A Dynamically Organized Ink and Audio Notebook. Proceedings of CHI '97 (Atlanta, GA, March 1997). http://www.acm.org/sigchi/chi97/proceedings/paper/ldw.htm
Current Address: Computer Science and Engineering Department, University of Washington, Seattle, WA 98195.

Figure 2: Text annotation software used by participants in the second study to create notes and highlights on a web page. The annotation index lists the annotations for the current page, including the orphaned annotations that could not be placed on the page.

Figure 1: Annotation example showing anchor types and surrounding context. In our user studies we focused primarily on annotations with range anchors.

Modification Score | Rating (number of annotations)
1                  | 1.50 (12)
2                  | 3.0 (18)
3                  | 3.0 (7)
4 (total delete)   | 7.0 (25)

Table 2: Median participant position satisfaction ratings, on a 1 to 7 Likert scale, for annotations where the anchor text changed and the annotations were not found (orphaned). As the amount of modification to the anchor text increased, participants were more satisfied that the annotation had been orphaned.

Modification score | 1 to 24% found | 25 to 49% found | 50 to 74% found | 75 to 100% found | Overall
1                  | 3.0 (3)        | 3.0 (13)        | 3.0 (19)        | 6.0 (18)         | 4.5 (53)
2                  | 2.0 (9)        | 3.0 (6)         | 4.0 (1)         | 5.0 (1)          | 3.0 (17)
3                  | -              | 3.0 (1)         | -               | -                | 3.0 (1)
Overall            | 2.5 (12)       | 3.0 (20)        | 3.5 (20)        | 6.0 (19)         | 4.0 (71)

Table 3: Median participant position satisfaction ratings for annotations where the anchor text changed and some percentage of it was found. Participant satisfaction is directly correlated with the amount of anchor text found and inversely correlated with the amount of modification to the anchor text. The number of annotations in each case is in parentheses.

Modification Type | Modification         | Description
Delete            | Minor Delete         | Between 1 character and half of the anchor is deleted.
                  | Medium Delete        | More than half of the anchor is deleted.
                  | Total Delete         | Entire anchor is deleted.
Reword            | Minor Reword         | Between 1 character and half the anchor is reworded.
                  | Medium Reword        | More than half the anchor is reworded, reorganized, or split into multiple pieces.
                  | Total Reword         | Complete anchor is reorganized. Typically only a few key words remain.
Move Anchor Text  | Anchor Text Indirect | Anchor text itself doesn't change, but the text around it does.
                  | Anchor Text Direct   | Anchor text moves within the paragraph or changes paragraphs.
Move Paragraph    | Paragraph Indirect   | The paragraph in front of the annotation's paragraph changes.
                  | Paragraph Direct     | The paragraph containing the annotation moves forward or backward.

Table 1: Annotation Anchor Modification Types. The table presents the different types of modifications that an annotation's anchor text may undergo in the document modification process. We use this classification in our study to understand users' expectations for robust annotation positioning.
Hyperlink >*B*ph,B@r, Body Text6]>V@> FollowedHyperlink >*B* ph0"@0 Caption xx6\:P: Body Text 2CJOJQJ^J:Q@: Body Text 36B*]ph<C< Body Text Indent ^:'@: Comment ReferenceCJaJDOD ParticipantComment  ^ 6,, Comment Text>T> Block Text Z]^Z6\:L@: Date!$5$7$8$9DH$a$CJ<O12< Style1"@& CJOJQJ\^JaJ^>@2^ Title##$<5$7$8$9D@&H$a$5CJ KHOJQJ\^JaJ locdefghijklmnopqrstw  S.R        FPcdefghijklmnopqrstw z       &&&O&&892HZm!3E^k_/ tI<IXv/ B#f#%v'( ****+D.a./00}1242B4V46|88f9D;<=%>1>t?A$ACCDEjFGGKMMqOOPRRUWXZp]y]N_`Y``'acdeffUikk~mpmqqt vOe [\]n"0"0#0#0"0"00000!000000000000000`00000000`00000 00^0^0^0^0^0^0^0^ 00<0< 0<0X0X0X 0<0  0<0B#0B#0B#0B# 00 * 0 * *0*0* 0 * *0D.0D.- 0D.- 0D.0D.( 0D.D.02( 0D.D.0B40B4 0 * *0|80|80|80|80|8 0 * *0%>0%> 00A 0A0C0C0C0C 0A0G0G 00M 0MM0qO0qO( 0qOqO0R0R 0MM0W0W 0MM0p]0p] 0p] 0p] 0p]0p]0p]( 0p]p]0d( 0p]p]0f0f( 0p]p]0k0k0k( 0p]p]0mq0mq( 0p]p]0 v0 v0 v0 v0 v0 v0 v 00} 0}}0~0~0~ 0}}0* 0* 0* 0* 00 0}}0Y0Y 0}}0 00ތ0ތ0ތ000 0 0 0 0 0 0 0 0 0 0  0  0  0  0  0 0 0 0@00@0 0000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000:A'/oOnrtuvx|~v|<Y{xanܧGJw]ت:=[oqswyz{}p9R   ; V Z 8SWXsxn)-Kfk49 5PU% @ D 8!S!W!!!!#$ $$4$9$D$_$c${$$$$$$((((((+++667777!;<;@;?@"@GGGbS}SSSnr!LQRƓ*QiޘDԛ,u ''' ''TTtttttttttXXXXXXXA W Y z  RT,2$c+jxcVΒsڅ# L0e0e     A5% 8c8c     ?1 d0u0@Ty2 NP'p<'pA)BCD|E||S"@S0 (   . 
6.?#" G h F C F#" ) n P S P#" 5 h R C R#"  h S C S#"  B S  ?a.V4RdIyPu'8,.,S,R&,F,J _Hlt493059630 _Hlt493059868 _Ref461868133 _Ref492889213 _Hlt493059715 _Ref492889230 _Hlt493059723 _Ref492888819 _Hlt492888833 _Hlt492956323 _Hlt493059632 _Hlt493059711 _Hlt493059801 _Ref492888862 _Hlt492888884 _Ref493044699 _Hlt493059642 _Ref492889417 _Hlt493059655 _Hlt493059664 _Hlt493054039 _Ref493059675 _Hlt493059699 _Hlt493060585 _Ref492888948 _Hlt492891306 _Hlt493059682 _Hlt493059847 _Ref461868225 _Hlt493059784 _Hlt430075074 _Hlt461867839 _Ref492889900 _Hlt493059870 _Ref492889162 _Hlt492985902 _Hlt493059886 _Ref492889353 _Hlt493059625 _Hlt493059728 _Ref492875142 _Hlt493059761 _Ref492979200 _Hlt493059628 _Hlt493059646 _Hlt492887709 _Hlt492887710 _Hlt492887817 _Hlt492887818 _Ref461868258 _Ref492889399 _Hlt493059732 _Ref492889650 _Hlt493059839 _Ref492888922 _Hlt492891311 _Hlt492891428 _Hlt493059852 _Ref492889676 _Hlt493059687 _Hlt493059850 _Hlt493059881 _Ref492888926 _Hlt492985898 _Hlt493059843 _Ref461868211 _Hlt493059781 _Hlt437064248 _Ref461868157 _Ref492889432 _Hlt493059758 _Ref478186209 _Ref492889380 _Hlt493059736; + kkkk(AUUuuuߘߘ ..PQQFFFFxxxĞƞƞ**@@@@ @@@ @ @ @ @@@@@@@@@@! @$"@#@'%@&@/(@0)@*@+@,@-@.@?21@43@85@6@7@<9@:@;@@=@>@CA@BDGE@FIH@[ + jjkkk)BSUtuuߘߘ //PPQEFFFwxxĞƞƞ')*9 ; [ 8yn .>CKl:5V% E 0!7!8!X!!!# $$$$:$<$C$D$d${$$$$a&h&s&z&((((++6777!;A;?#@GGbSSnnSs !Rג kr(+2BR8AQYRZxÞƞО8?KRX` vw_lrѩ9EZ ; [ 8yn.Klr:5V% E 8!X!Y!a!!!# $$:$D$d${$$$$&&((((++556777!;A;?#@GGJJVRcRbSSz\\taxakkppqqQ~~~~Ss!Rrt vwT_uxO[ѩ3333333333333333333333n ; [ 8yn.Kl:5V% E 8!X!!!# $$:$D$d${$$$$((((++6777!;A;?#@GGbSSSs!R88 uwwx A.J. BrushfC:\Documents and Settings\t-ajb\Application Data\Microsoft\Word\AutoRecovery save of context_FINAL.asd A.J. Brush8\\anoop1\papers\Chi01\AnnotCntxt\context_FINALAUTHOR.doc A.J. Brush8\\anoop1\papers\Chi01\AnnotCntxt\context_FINALAUTHOR.doc A.J. BrushC:\CHI01\context_MSR_TR.doc A.J. BrushC:\CHI01\context_MSR_TR.doc A.J. 
Brush^C:\Documents and Settings\t-ajb\Application Data\Microsoft\Word\AutoRecovery save of 00-95.asd A.J. BrushC:\CHI01\00-95.doc A.J. BrushC:\CHI01\00-95.doc A.J. BrushC:\CHI01\00-95.dockendras6\\research\root\web\external\ftp\pub\TR\TR-2000-95.doc.5ɨ$ؾ&+H6 r`$s?NeWm:deI %UA&I.!,,yD~KK4rP /^0,NFU>e-[p1..         `\         p        ^                          ^                          ^                                   ^        (ʌ                 ^        ^        p                          &_        p        ^                                            _/ o w,abdnoqz{}.?GHJR[dmvwyŤΤפ_rئ٦ڦ :;BO=>Oe [\]n@HP LaserJet IIILPT1:winspoolHP LaserJet IIIHP LaserJet III0C od,,LetterDINU"0_HP LaserJet III0C od,,LetterDINU"0_sP@UnknownGz Times New Roman5Symbol3& z Arial;& z Helvetica?5 z Courier New;Wingdings"hIFIFCI&C!xx0dM 2Q%!PS-Adobe-3.0 A.J. BrushkendrasF ZZZcontext_AJB_9_3AM.doc##0t-ajb_t-ajb __Oh+'0  8 D P \hpx%!PS-Adobe-3.0!PS A.J. Brush3.J..J.Normalukendras2ndMicrosoft Word 9.0@F#@`@'@'&՜.+,D՜.+,H hp  Microsoft Corp.C %!PS-Adobe-3.0 Title 8@ _PID_HLINKSAT0>0http://www.thirdvoice.com/_Ihttp://www.cs.berkeley.edu/~phelps/Robust/Locations/RobustLocations.html O>http://www.cs.berkeley.edu/~phelps/papers/edl97-abstract.htmlC!http://www.microsoft.com/reader/%4http://www.e-quill.com/>6http://www.cs.wisc.edu/~ghost/e70http://www.adobe.com/products/acrobat/main.html(x)C:\Annotations\Figures\Study2Screen4.bmp  !"#$%&'()*+,-./0123456789:;<=>?@ABCDEFGHIJKLMNOPQRSTUVWXYZ[\]^_`abcdefghijklmnopqrstuvwxyz{|}~      !"#$%&'()*+,-./0123456789:;<=>?@ABCDEFGHJKLMNOPQRSTUVWXYZ[\]^_`abcdefghijklmnopqrstuvwxyz{|}~Root Entry F'Data Si1TableIWordDocument&SummaryInformation(DocumentSummaryInformation8CompObjjObjectPool''  FMicrosoft Word Document MSWordDocWord.Document.89q