{"id":287426,"date":"2014-01-07T09:00:00","date_gmt":"2014-01-07T17:00:00","guid":{"rendered":"https:\/\/www.microsoft.com\/en-us\/research\/?p=287426"},"modified":"2016-09-09T11:05:08","modified_gmt":"2016-09-09T18:05:08","slug":"new-spin-photosynth","status":"publish","type":"post","link":"https:\/\/www.microsoft.com\/en-us\/research\/blog\/new-spin-photosynth\/","title":{"rendered":"A New Spin for Photosynth"},"content":{"rendered":"

\u201cMakes HDTV look low-res.\u201d<\/p>\n

\u201cWow, how did they make this?\u201d<\/p>\n

\u201cHyper-detailed and actually looks 3-D.\u201d<\/p>\n

These are just some of the words reviewers have used to describe the new Photosynth. A technical preview, launched December 10, enables photography enthusiasts to create even more realistic 3-D views of objects and locations captured through still photos.<\/p>\n

A user can upload multiple images of the same object or physical space to the Photosynth cloud service, which then processes the images into a \u201csynth\u201d\u2014a composite of overlapping photographs that create a 3-D model of the space, with added depth and transitional images for smooth 3-D viewing.<\/p>\n


Photosynth turns ordinary photos into 3-D views that travel along or spin around an axis.<\/p><\/div>\n

Photosynth stitches together the photos to create spin, panorama, walk, or wall synths that draw the viewer along a path or spin around an axis. It\u2019s obvious there\u2019s sophisticated technology at work here. Less apparent is the extent and duration of the relationship between Microsoft Research\u2019s Interactive Visual Media (IVM) group and the Photosynth product team.<\/p>\n

It Began with Photo Tourism<\/h2>\n

\u201cThe first bit of ground-breaking work was back in 2006 with Photo Tourism,\u201d says Eric Stollnitz, IVM principal developer. \u201cIt began as a collaboration between Noah Snavely and Steve Seitz at the University of Washington and my colleague Richard Szeliski. The goal was to take web images of the same landmark\u2014Paris\u2019 Notre-Dame or the Eiffel Tower, for example\u2014shot by different photographers at different times, and combine them to produce a 3-D visualization of that landmark. It was a photo-crowdsourcing idea.\u201d<\/p>\n

A Microsoft product group refined Photo Tourism and, in 2008, launched Photosynth, a desktop product that stitched together a user\u2019s collection of images into a 3-D model that could be uploaded to the Photosynth website for sharing.<\/p>\n

Photosynth was enhanced in 2010 with the incorporation of Image Composite Editor (ICE), an advanced image stitcher that combines a set of overlapping photographs to create a seamless, 360-degree panorama at full resolution.<\/p>\n

\u201cThe Bing team was really excited about this,\u201d Stollnitz recalls, \u201cand they added a panorama-viewing feature to the website, while the IVM team added the ability for ICE to export panoramas to the Photosynth desktop application, which meant Photosynth could upload your panorama to the website for sharing.\u201d<\/p>\n

In quick succession, Bing launched its Mobile Panoramas capture-and-display application for iOS devices in 2011 and added a Windows Phone app and improved social sharing in 2012. And now, the new Photosynth delivers even more 3-D realism, thanks to innovative work on transitions and navigation.<\/p>\n

The Parallax View<\/h2>\n

Transitions are reconstructions of plausible views between actual photographs, synthesized to fill in gaps and provide smoother movement. As part of the IVM\u2019s Spin project, a 2009 paper titled <em>Piecewise Planar Stereo for Image-based Rendering<\/em>\u2014by researcher Sudipta N. Sinha; Drew Steedly, principal development manager at Microsoft; and Szeliski\u2014proposed a way to create more realistic transitions when moving from photograph to photograph.<\/p>\n

\u201cThey decided to do something about synths that looked as though they\u2019d been projected onto a single plane,\u201d Stollnitz explains. \u201cEssentially, we were showing flat screen after flat screen joined together. But in real life, you\u2019d see objects from different angles and depths as you move along. We had to fill in more information gaps.\u201d<\/p>\n

The answer to more realistic transitions was to use computer-vision techniques to calculate the depth of each pixel. This required analyzing each pair of overlapping photographs and comparing objects that appeared in both images to determine how far they were from the camera.<\/p>\n


Photosynth\u2019s internal representation of this synth includes coarse 3-D surfaces culminating in a vanishing point at the end of the walk.<\/p><\/div>\n

\u201cObjects that shift just a little bit from one image to the next are farther away,\u201d Stollnitz says, \u201cwhile objects that have shifted quite a bit from one image to the next are closer in. It\u2019s all about parallax. Sudipta really focused on this challenge. He\u2019s a master at this \u2018depth from stereo\u2019 technique.\u201d<\/p>\n

With a depth calculation for every pixel, the researchers were able to simplify the information and construct 3-D surfaces from a relatively small number of planes. By projecting images onto these coarse 3-D approximations instead of a single plane, the team created transitions that are far more immersive, with different depths and angles.<\/p>\n
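The "depth from stereo" idea described above can be sketched in a few lines. This is a simplified illustration, not Photosynth's actual code: it assumes a rectified image pair whose per-pixel disparity (the apparent shift of a point between the two views) has already been measured, and the focal length and camera baseline are hypothetical example values.<\/p>\n

```python
import math

def depth_from_disparity(disparity_px, focal_length_px, baseline_m):
    """Depth (in meters) of a point from its disparity between two views.

    Objects that shift a lot between the images (large disparity) are
    close to the camera; objects that barely shift are far away.
    """
    if disparity_px <= 0:
        return math.inf  # no measurable shift: effectively at infinity
    return focal_length_px * baseline_m / disparity_px

# A nearby lamppost shifts 40 px between the two photos; a distant
# building shifts only 2 px (example numbers).
near = depth_from_disparity(40, focal_length_px=1000, baseline_m=0.5)  # 12.5 m
far = depth_from_disparity(2, focal_length_px=1000, baseline_m=0.5)    # 250.0 m
```

Running this estimate over every pixel pair, as the researchers did at much greater sophistication, yields the per-pixel depth map from which the coarse 3-D planes are fit.<\/p>\n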

Spinning Through 3-D<\/h2>\n

Another enhancement the Spin project brought to Photosynth was simpler navigation. Depending on the number of images and their relationship to the physical space, there could be many potential paths through which to travel in a synth.<\/p>\n

\u201cThat means many 3-D relationships,\u201d Stollnitz says, \u201cwhich is ultimately very powerful, but it can also be overwhelming and confusing if you\u2019re able to rotate as well as move forward and back, left and right, up and down as you\u2019re making your way through a collection. Furthermore, in situations where photos are only loosely connected to each other, it becomes difficult to align points in each image and handle the transition smoothly if you try to accommodate every possible path.\u201d<\/p>\n

Instead, researcher Johannes Kopf simplified navigation and, in doing so, also enhanced the 3-D experience. By restricting navigation to a circular path\u2014either an outward-looking \u201cpanorama\u201d from a fixed spot or a \u201cspin\u201d around an object\u2014transitions between photos became much smoother.<\/p>\n

\u201cThe computer-vision work for pixel depth was that much easier,\u201d Stollnitz says, \u201cand projections onto the 3-D surfaces looked really good. It was definitely a \u2018wow\u2019 moment when we saw how well these two approaches came together. The transitions in these synths offer a much more realistic sense of depth.\u201d<\/p>\n
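Why the circular constraint simplifies things can be shown with a small sketch. This is a hypothetical illustration, not the product\u2019s code: once the camera is confined to a circle, the viewer\u2019s entire state collapses to a single angle, and moving between neighboring photos is just interpolating that angle.<\/p>\n

```python
import math

def spin_camera(angle_rad, radius, center=(0.0, 0.0)):
    """Camera position on a circle around `center`, facing the center (a 'spin').

    For an outward-looking 'panorama', the camera would instead sit at the
    center, and the angle would simply be its viewing direction.
    """
    x = center[0] + radius * math.cos(angle_rad)
    y = center[1] + radius * math.sin(angle_rad)
    heading = math.atan2(center[1] - y, center[0] - x)  # point at the object
    return (x, y), heading

def interpolate_angle(a, b, t):
    """Blend two camera angles along the shortest arc (0 <= t <= 1)."""
    diff = (b - a + math.pi) % (2 * math.pi) - math.pi
    return a + t * diff

# Halfway between photos captured at 0 and 90 degrees is the view at 45 degrees.
mid_view = interpolate_angle(0.0, math.pi / 2, 0.5)
```

Because every intermediate view lies on the same one-dimensional path, the synthesized transitions never have to reconcile arbitrary camera motions, which is part of why the depth projections line up so well.<\/p>\n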

The \u201cspin\u201d navigation and depth from stereo also impressed the Bing team, which immediately added the new technology to Photosynth and suggested two more synth scenarios: \u201cwalk,\u201d for images taken with the camera moving forward into a scene, and \u201cwall,\u201d in which the camera takes images perpendicular to the direction of movement.<\/p>\n

Another key decision was to host the new Photosynth in the cloud. Both the Bing and IVM teams felt the processing demanded by 3-D spins would take too much time on smaller devices, so they re-engineered the technology to run on Microsoft Azure and to handle processing for thousands of incoming photo collections, a feat Stollnitz considers a significant engineering accomplishment.<\/p>\n

\u201cPhotography\u2014particularly travel photography\u2014is one of my hobbies,\u201d he says. \u201cMy wife and I always come back from trips with tons of photos. Collaboration with the Photosynth product team has been particularly satisfying, not just because I get to indulge my passion for visual media, but also because we\u2019ve been able to amplify the ideas of the original research work and develop a robust, powerful, yet easy-to-use tool and viewing experience. And now, by releasing Photosynth as a cloud-based service, we\u2019ve made it possible for just about anybody to use our technology.\u201d<\/p>\n

Close, Ongoing Collaboration<\/h2>\n

Stollnitz has been the interface between the IVM and Photosynth teams, responsible for product contributions: making code reliable and usable not just for research purposes, but solid enough to hand off to product developers.<\/p>\n

\u201cA number of researchers at IVM have been involved with Photosynth,\u201d Stollnitz says. \u201cRick with Photo Tourism and the first generation of Photosynth, for example, and Matt Uyttendaele with ICE. They\u2019ve been great to have as technical advisers for this latest generation of Photosynth.<\/p>\n

\u201cOn the Photosynth product team, David Gedye has been the lead program manager since the beginning, so we\u2019ve had that continuity of vision over a multiyear collaboration, which we feel has been very productive.\u201d<\/p>\n

Gedye feels the same.<\/p>\n

\u201cOver the years,\u201d he says, \u201cthe IVM group has moved beyond contributing just ideas and prototypes. Now, they\u2019re providing shipping-quality code and development support. Just as an example, with the new Photosynth, Sudipta contributed the core computer-vision algorithms, made important technical-design contributions, and came to every product meeting. He\u2019s been one of the driving forces behind this release. We feel very tightly coupled with the research team.\u201d<\/p>\n

The process culminating in the new Photosynth, Szeliski observes, has been a delight to behold.<\/p>\n

\u201cWhen you look back,\u201d he says, \u201cand watch how Photosynth capabilities have evolved, it\u2019s terrific how different members of the IVM have contributed a diverse set of research ideas, including Sudipta\u2019s foundational work in 3-D reconstruction, Johannes\u2019 work on image-based rendering and 3-D navigation, and Eric\u2019s work on user interfaces and cloud services. The Spin project, which produced all of these breakthrough features, also represents our closest working relationship yet with the Photosynth product team.\u201d<\/p>\n

The fruits of that partnership are now yours to enjoy. Sign up, and you\u2019ll receive confirmation and access within 24 hours.<\/p>\n","protected":false},"excerpt":{"rendered":"

\u201cMakes HDTV look low-res.\u201d \u201cWow, how did they make this?\u201d \u201cHyper-detailed and actually looks 3-D.\u201d These are just some of the words reviewers have used to describe the new Photosynth. A technical preview, launched December 10, enables photography enthusiasts to create even more realistic 3-D views of objects and locations captured through still photos. A […]<\/p>\n","protected":false},"author":39507,"featured_media":0,"comment_status":"closed","ping_status":"closed","sticky":false,"template":"","format":"standard","meta":{"msr-url-field":"","msr-podcast-episode":"","msrModifiedDate":"","msrModifiedDateEnabled":false,"ep_exclude_from_search":false,"_classifai_error":"","footnotes":""},"categories":[205399,194471,194480],"tags":[212096,212081,194524,212105,186604,212099,212063,212102,212087,212108,187252,212111,212090,212093,203271,203273,212084,212114],"research-area":[13562,13551],"msr-region":[],"msr-event-type":[],"msr-locale":[268875],"msr-post-option":[],"msr-impact-theme":[],"msr-promo-type":[],"msr-podcast-series":[],"class_list":["post-287426","post","type-post","status-publish","format-standard","hentry","category-azure","category-computer-vision","category-graphics-and-multimedia","tag-3-d-model","tag-3-d-views","tag-3-d-visualization","tag-360-degree-panorama","tag-bing","tag-collection-of-images","tag-image-composite-editor-ice","tag-image-stitcher","tag-interactive-visual-media-ivm","tag-mobile-panoramas-mobile-app","tag-panorama","tag-parallax","tag-photo-tourism","tag-photo-crowdsourcing","tag-photography","tag-photosynth","tag-synth","tag-visual-media","msr-research-area-computer-vision","msr-research-area-graphics-and-multimedia","msr-locale-en_us"],"msr_event_details":{"start":"","end":"","location":""},"podcast_url":"","podcast_episode":"","msr_research_lab":[],"msr_impact_theme":[],"related-publications":[],"related-downloads":[],"related-videos":[],"related-academic-programs":[],"related-groups":[],"related-projects"
:[],"related-events":[],"related-researchers":[],"msr_type":"Post","byline":"","formattedDate":"January 7, 2014","formattedExcerpt":"\u201cMakes HDTV look low-res.\u201d \u201cWow, how did they make this?\u201d \u201cHyper-detailed and actually looks 3-D.\u201d These are just some of the words reviewers have used to describe the new Photosynth. A technical preview, launched December 10, enables photography enthusiasts to create even more realistic 3-D…","locale":{"slug":"en_us","name":"English","native":"","english":"English"},"_links":{"self":[{"href":"https:\/\/www.microsoft.com\/en-us\/research\/wp-json\/wp\/v2\/posts\/287426"}],"collection":[{"href":"https:\/\/www.microsoft.com\/en-us\/research\/wp-json\/wp\/v2\/posts"}],"about":[{"href":"https:\/\/www.microsoft.com\/en-us\/research\/wp-json\/wp\/v2\/types\/post"}],"author":[{"embeddable":true,"href":"https:\/\/www.microsoft.com\/en-us\/research\/wp-json\/wp\/v2\/users\/39507"}],"replies":[{"embeddable":true,"href":"https:\/\/www.microsoft.com\/en-us\/research\/wp-json\/wp\/v2\/comments?post=287426"}],"version-history":[{"count":6,"href":"https:\/\/www.microsoft.com\/en-us\/research\/wp-json\/wp\/v2\/posts\/287426\/revisions"}],"predecessor-version":[{"id":290876,"href":"https:\/\/www.microsoft.com\/en-us\/research\/wp-json\/wp\/v2\/posts\/287426\/revisions\/290876"}],"wp:attachment":[{"href":"https:\/\/www.microsoft.com\/en-us\/research\/wp-json\/wp\/v2\/media?parent=287426"}],"wp:term":[{"taxonomy":"category","embeddable":true,"href":"https:\/\/www.microsoft.com\/en-us\/research\/wp-json\/wp\/v2\/categories?post=287426"},{"taxonomy":"post_tag","embeddable":true,"href":"https:\/\/www.microsoft.com\/en-us\/research\/wp-json\/wp\/v2\/tags?post=287426"},{"taxonomy":"msr-research-area","embeddable":true,"href":"https:\/\/www.microsoft.com\/en-us\/research\/wp-json\/wp\/v2\/research-area?post=287426"},{"taxonomy":"msr-region","embeddable":true,"href":"https:\/\/www.microsoft.com\/en-us\/research\/wp-json\/wp\/v2\/ms
r-region?post=287426"},{"taxonomy":"msr-event-type","embeddable":true,"href":"https:\/\/www.microsoft.com\/en-us\/research\/wp-json\/wp\/v2\/msr-event-type?post=287426"},{"taxonomy":"msr-locale","embeddable":true,"href":"https:\/\/www.microsoft.com\/en-us\/research\/wp-json\/wp\/v2\/msr-locale?post=287426"},{"taxonomy":"msr-post-option","embeddable":true,"href":"https:\/\/www.microsoft.com\/en-us\/research\/wp-json\/wp\/v2\/msr-post-option?post=287426"},{"taxonomy":"msr-impact-theme","embeddable":true,"href":"https:\/\/www.microsoft.com\/en-us\/research\/wp-json\/wp\/v2\/msr-impact-theme?post=287426"},{"taxonomy":"msr-promo-type","embeddable":true,"href":"https:\/\/www.microsoft.com\/en-us\/research\/wp-json\/wp\/v2\/msr-promo-type?post=287426"},{"taxonomy":"msr-podcast-series","embeddable":true,"href":"https:\/\/www.microsoft.com\/en-us\/research\/wp-json\/wp\/v2\/msr-podcast-series?post=287426"}],"curies":[{"name":"wp","href":"https:\/\/api.w.org\/{rel}","templated":true}]}}