{"id":345770,"date":"2017-01-03T13:30:53","date_gmt":"2017-01-03T21:30:53","guid":{"rendered":"https:\/\/www.microsoft.com\/en-us\/research\/?post_type=msr-research-item&p=345770"},"modified":"2018-10-16T22:10:12","modified_gmt":"2018-10-17T05:10:12","slug":"parameterized-environment-maps","status":"publish","type":"msr-research-item","link":"https:\/\/www.microsoft.com\/en-us\/research\/publication\/parameterized-environment-maps\/","title":{"rendered":"Parameterized Environment Maps"},"content":{"rendered":"
Static environment maps fail to capture local reflections, including effects such as self-reflections and parallax in the reflected imagery. We instead propose parameterized environment maps (PEMs), a set of per-view environment maps that accurately reproduce local reflections at each viewpoint as computed by an offline ray tracer. Even with a small set of viewpoint samples, PEMs support plausible movement away from and between the pre-rendered viewpoint samples while maintaining local reflections. They also make use of environment maps supported in graphics hardware to provide real-time exploration of the pre-rendered space. In addition to parameterization by viewpoint, our notion of PEM extends to general, multidimensional parameterizations of the scene, including relative motions of objects and lighting changes. Our contributions include a technique for inferring environment maps that provide a close match to ray-traced imagery. We also explicitly infer and encode all MIPMAP levels of the PEMs to achieve higher accuracy. We propose layered environment maps that separate local and distant reflected geometry, and we explore several environment map shapes, including finite spheres, ellipsoids, and boxes, that better approximate the environmental geometry. We demonstrate results showing faithful local reflections in an interactive viewer.
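To illustrate why a finite proxy shape (such as the boxes mentioned above) recovers parallax that a conventional, infinitely distant environment map loses, the sketch below computes a parallax-corrected lookup direction by intersecting the reflection ray with an axis-aligned box proxy. This is a minimal, hypothetical example of the general idea, not the paper's implementation; all function names and the box parameters are assumptions for illustration.

```python
import numpy as np

def reflect(view_dir, normal):
    """Mirror the (unit) view direction about the (unit) surface normal."""
    return view_dir - 2.0 * np.dot(view_dir, normal) * normal

def box_corrected_lookup(surface_pos, refl_dir, box_min, box_max, box_center):
    """Intersect the reflection ray with an axis-aligned box proxy and return
    the direction from the box center to the hit point.  Sampling a cube map
    with this corrected direction, instead of the raw reflection direction,
    restores parallax in the reflected imagery for a box-shaped environment."""
    # Slab test: parametric distances along the ray to each pair of box faces.
    # Assumes refl_dir has no exactly-zero components.
    inv_d = 1.0 / refl_dir
    t1 = (box_min - surface_pos) * inv_d
    t2 = (box_max - surface_pos) * inv_d
    # Exit distance for a ray whose origin lies inside the box.
    t_exit = np.min(np.maximum(t1, t2))
    hit = surface_pos + t_exit * refl_dir
    d = hit - box_center
    return d / np.linalg.norm(d)

if __name__ == "__main__":
    # A point on the floor of a 10 x 10 x 10 room, reflecting toward a wall.
    pos    = np.array([2.0, 0.0, 3.0])
    normal = np.array([0.0, 1.0, 0.0])
    view   = np.array([1.0, -2.0, 2.0]) / 3.0   # unit view direction
    r = reflect(view, normal)
    lookup = box_corrected_lookup(
        pos, r,
        box_min=np.array([-5.0, 0.0, -5.0]),
        box_max=np.array([5.0, 10.0, 5.0]),
        box_center=np.array([0.0, 5.0, 0.0]))
    print("raw reflection dir:     ", r)
    print("parallax-corrected dir: ", lookup)
```

Note how the corrected direction depends on the surface position as well as the reflection direction, so nearby geometry shifts plausibly as the viewpoint moves; a standard (infinitely distant) environment map would use the raw reflection direction alone and show no such parallax.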