{"id":376952,"date":"2017-04-11T04:01:34","date_gmt":"2017-04-11T11:01:34","guid":{"rendered":"https:\/\/www.microsoft.com\/en-us\/research\/?post_type=msr-research-item&p=376952"},"modified":"2018-10-16T21:59:14","modified_gmt":"2018-10-17T04:59:14","slug":"presenting-physical-things-digitally-new-collecting-practices","status":"publish","type":"msr-research-item","link":"https:\/\/www.microsoft.com\/en-us\/research\/publication\/presenting-physical-things-digitally-new-collecting-practices\/","title":{"rendered":"Presenting Physical Things Digitally: New Collecting Practices"},"content":{"rendered":"

The motivations for collecting and the idiosyncrasies of physical and digital collections have long been studied. However, how they are presented in the digital space remains an unresolved challenge. To better understand this problem from a design perspective, we built Thinga.Me, a system that allows users to capture photographs of physical objects, cut them out, place them into digital collections, and share them. By segmenting the object from the background, the interface creates the illusion of a physical item, giving a sense of carrying your stuff with you in your pocket. Following two years of development, iteration, and feedback, we discuss uses of the app and the implications it can have for changing the way we reflect on physical things in our lives. In particular, we focus on how digital collections are presented and displayed realistically as a way of providing more meaning and helping shape users\u2019 identities. Demonstrating the importance of visual design choices, our results lead to considerations on how to most appropriately display physical objects in the virtual world, whilst avoiding the uncanniness some might experience when interacting with skeuomorphic collections.<\/p>\n","protected":false},"excerpt":{"rendered":"

The motivations for collecting and the idiosyncrasies of physical and digital collections have long been studied. However, how they are presented in the digital space remains an unresolved challenge. To better understand this problem from a design perspective, we built Thinga.Me, a system that allows users to capture photographs of physical objects […]<\/p>\n","protected":false},"featured_media":0,"template":"","meta":{"msr-url-field":"","msr-podcast-episode":"","msrModifiedDate":"","msrModifiedDateEnabled":false,"ep_exclude_from_search":false,"_classifai_error":"","footnotes":""},"msr-content-type":[3],"msr-research-highlight":[],"research-area":[13554],"msr-publication-type":[193716],"msr-product-type":[],"msr-focus-area":[],"msr-platform":[],"msr-download-source":[],"msr-locale":[268875],"msr-post-option":[],"msr-field-of-study":[],"msr-conference":[],"msr-journal":[],"msr-impact-theme":[],"msr-pillar":[],"class_list":["post-376952","msr-research-item","type-msr-research-item","status-publish","hentry","msr-research-area-human-computer-interaction","msr-locale-en_us"],"msr_publishername":"","msr_edition":"Research Through Design 
2017","msr_affiliation":"","msr_published_date":"2017-03-22","msr_host":"","msr_duration":"","msr_version":"","msr_speaker":"","msr_other_contributors":"","msr_booktitle":"","msr_pages_string":"","msr_chapter":"","msr_isbn":"","msr_journal":"","msr_volume":"","msr_number":"","msr_editors":"","msr_series":"","msr_issue":"","msr_organization":"","msr_how_published":"","msr_notes":"","msr_highlight_text":"","msr_release_tracker_id":"","msr_original_fields_of_study":"","msr_download_urls":"","msr_external_url":"","msr_secondary_video_url":"","msr_longbiography":"","msr_microsoftintellectualproperty":1,"msr_main_download":"376955","msr_publicationurl":"http:\/\/researchthroughdesign.org\/2017\/","msr_doi":"","msr_publication_uploader":[{"type":"file","title":"Harrison_RTD2017_ThingaMe","viewUrl":"https:\/\/www.microsoft.com\/en-us\/research\/wp-content\/uploads\/2017\/04\/Harrison_RTD2017_ThingaMe.pdf","id":376955,"label_id":0},{"type":"url","title":"http:\/\/researchthroughdesign.org\/2017\/","viewUrl":false,"id":false,"label_id":0}],"msr_related_uploader":"","msr_attachments":[{"id":0,"url":"http:\/\/researchthroughdesign.org\/2017\/"}],"msr-author-ordering":[{"type":"text","value":"Daniel 
Harrison","user_id":0,"rest_url":false},{"type":"user_nicename","value":"rbanks","user_id":33361,"rest_url":"https:\/\/www.microsoft.com\/en-us\/research\/wp-json\/microsoft-research\/v1\/researchers?person=rbanks"},{"type":"user_nicename","value":"timregan","user_id":34053,"rest_url":"https:\/\/www.microsoft.com\/en-us\/research\/wp-json\/microsoft-research\/v1\/researchers?person=timregan"},{"type":"user_nicename","value":"mgrayson","user_id":32893,"rest_url":"https:\/\/www.microsoft.com\/en-us\/research\/wp-json\/microsoft-research\/v1\/researchers?person=mgrayson"}],"msr_impact_theme":[],"msr_research_lab":[],"msr_event":[],"msr_group":[],"msr_project":[563061,549501],"publication":[],"video":[],"download":[],"msr_publication_type":"inproceedings","related_content":{"projects":[{"ID":563061,"post_title":"Conversational Agents that See","post_name":"conversational-agents-that-see","post_type":"msr-project","post_date":"2019-03-28 03:37:19","post_modified":"2022-09-06 10:01:24","post_status":"publish","permalink":"https:\/\/www.microsoft.com\/en-us\/research\/project\/conversational-agents-that-see\/","post_excerpt":"In this project we explore what it might mean to augment digital agents such as Microsoft Cortana with the ability to see.\u00a0By partnering with experts in computer vision, speech and machine learning, we ask whether the ability for an agent to see a person\u2019s activities and context might make its capabilities more effective.","_links":{"self":[{"href":"https:\/\/www.microsoft.com\/en-us\/research\/wp-json\/wp\/v2\/msr-project\/563061"}]}},{"ID":549501,"post_title":"Thinga.Me","post_name":"thinga-me","post_type":"msr-project","post_date":"2018-11-09 08:45:33","post_modified":"2021-04-22 08:46:41","post_status":"publish","permalink":"https:\/\/www.microsoft.com\/en-us\/research\/project\/thinga-me\/","post_excerpt":"Featuring \"GrabCut\" segmentation technology, developed at Microsoft Research Cambridge, Thinga.Me was an iPhone app that allowed 
people to take photos of the important physical items in their lives, quickly cut them out, and arrange them in rich digital collections which they could share with others.","_links":{"self":[{"href":"https:\/\/www.microsoft.com\/en-us\/research\/wp-json\/wp\/v2\/msr-project\/549501"}]}}]},"_links":{"self":[{"href":"https:\/\/www.microsoft.com\/en-us\/research\/wp-json\/wp\/v2\/msr-research-item\/376952"}],"collection":[{"href":"https:\/\/www.microsoft.com\/en-us\/research\/wp-json\/wp\/v2\/msr-research-item"}],"about":[{"href":"https:\/\/www.microsoft.com\/en-us\/research\/wp-json\/wp\/v2\/types\/msr-research-item"}],"version-history":[{"count":2,"href":"https:\/\/www.microsoft.com\/en-us\/research\/wp-json\/wp\/v2\/msr-research-item\/376952\/revisions"}],"predecessor-version":[{"id":540924,"href":"https:\/\/www.microsoft.com\/en-us\/research\/wp-json\/wp\/v2\/msr-research-item\/376952\/revisions\/540924"}],"wp:attachment":[{"href":"https:\/\/www.microsoft.com\/en-us\/research\/wp-json\/wp\/v2\/media?parent=376952"}],"wp:term":[{"taxonomy":"msr-content-type","embeddable":true,"href":"https:\/\/www.microsoft.com\/en-us\/research\/wp-json\/wp\/v2\/msr-content-type?post=376952"},{"taxonomy":"msr-research-highlight","embeddable":true,"href":"https:\/\/www.microsoft.com\/en-us\/research\/wp-json\/wp\/v2\/msr-research-highlight?post=376952"},{"taxonomy":"msr-research-area","embeddable":true,"href":"https:\/\/www.microsoft.com\/en-us\/research\/wp-json\/wp\/v2\/research-area?post=376952"},{"taxonomy":"msr-publication-type","embeddable":true,"href":"https:\/\/www.microsoft.com\/en-us\/research\/wp-json\/wp\/v2\/msr-publication-type?post=376952"},{"taxonomy":"msr-product-type","embeddable":true,"href":"https:\/\/www.microsoft.com\/en-us\/research\/wp-json\/wp\/v2\/msr-product-type?post=376952"},{"taxonomy":"msr-focus-area","embeddable":true,"href":"https:\/\/www.microsoft.com\/en-us\/research\/wp-json\/wp\/v2\/msr-focus-area?post=376952"},{"taxonomy":"msr-platform","embedda
ble":true,"href":"https:\/\/www.microsoft.com\/en-us\/research\/wp-json\/wp\/v2\/msr-platform?post=376952"},{"taxonomy":"msr-download-source","embeddable":true,"href":"https:\/\/www.microsoft.com\/en-us\/research\/wp-json\/wp\/v2\/msr-download-source?post=376952"},{"taxonomy":"msr-locale","embeddable":true,"href":"https:\/\/www.microsoft.com\/en-us\/research\/wp-json\/wp\/v2\/msr-locale?post=376952"},{"taxonomy":"msr-post-option","embeddable":true,"href":"https:\/\/www.microsoft.com\/en-us\/research\/wp-json\/wp\/v2\/msr-post-option?post=376952"},{"taxonomy":"msr-field-of-study","embeddable":true,"href":"https:\/\/www.microsoft.com\/en-us\/research\/wp-json\/wp\/v2\/msr-field-of-study?post=376952"},{"taxonomy":"msr-conference","embeddable":true,"href":"https:\/\/www.microsoft.com\/en-us\/research\/wp-json\/wp\/v2\/msr-conference?post=376952"},{"taxonomy":"msr-journal","embeddable":true,"href":"https:\/\/www.microsoft.com\/en-us\/research\/wp-json\/wp\/v2\/msr-journal?post=376952"},{"taxonomy":"msr-impact-theme","embeddable":true,"href":"https:\/\/www.microsoft.com\/en-us\/research\/wp-json\/wp\/v2\/msr-impact-theme?post=376952"},{"taxonomy":"msr-pillar","embeddable":true,"href":"https:\/\/www.microsoft.com\/en-us\/research\/wp-json\/wp\/v2\/msr-pillar?post=376952"}],"curies":[{"name":"wp","href":"https:\/\/api.w.org\/{rel}","templated":true}]}}