{"id":381158,"date":"2017-05-04T00:00:30","date_gmt":"2017-05-04T07:00:30","guid":{"rendered":"https:\/\/www.microsoft.com\/en-us\/research\/?post_type=msr-research-item&p=381158"},"modified":"2018-07-19T12:43:25","modified_gmt":"2018-07-19T19:43:25","slug":"writlarge-ink-unleashed-unified-scope-action-zoom-2","status":"publish","type":"msr-video","link":"https:\/\/www.microsoft.com\/en-us\/research\/video\/writlarge-ink-unleashed-unified-scope-action-zoom-2\/","title":{"rendered":"WritLarge: Ink Unleashed by Unified Scope, Action, & Zoom"},"content":{"rendered":"

WritLarge is a prototype system from Microsoft Research for the 84″ Microsoft Surface Hub, a large electronic whiteboard that supports both pen and multi-touch input. WritLarge lets creators unleash the latent expressive power of ink: using multi-touch, the user simply frames a portion of their “whiteboard” session between thumb and forefinger, then acts on that selection (for example, by copying, sharing, organizing, or otherwise transforming the content) with the pen held in the opposite hand. This approach shows how pen and touch input complement one another and afford completely new, and completely natural, ways of using freeform content to “ink at the speed of thought” on Surface devices.

Video: https://youtu.be/6lWe9PvabAo
Research area: Human-computer interaction