Escapement: A Tool for Interactive Prototyping with Video via Sensor-Mediated Abstraction of Time

CHI '23: Proceedings of the 2023 CHI Conference on Human Factors in Computing Systems |

Published by ACM

We present Escapement, a video prototyping tool that introduces a powerful new concept for prototyping screen-based interfaces by flexibly mapping sensor values to dynamic playback control of videos. This recasts the time dimension of video mock-ups as sensor-mediated interaction.

This abstraction of time as interaction, which we dub video-escapement prototyping, empowers designers to rapidly explore and viscerally experience direct touch or sensor-mediated interactions across one or more device displays. Our system affords cross-device and bidirectional remote (tele-present) experiences via cloud-based state sharing across multiple devices. This makes Escapement especially potent for exploring multi-device, dual-screen, or remote-work interactions for screen-based applications.
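The cross-device behavior described above can be pictured as shared state: each device publishes its normalized time index, and every connected display renders the corresponding frame. The following is a minimal in-memory sketch of this idea (the class and method names are illustrative, not Escapement's actual cloud API):

```python
# Hypothetical sketch of cloud-based state sharing: devices publish a
# normalized time index to a shared channel, and all peers render the
# matching frame. Escapement's real implementation is not shown here.

class SharedTimeState:
    """Stand-in for a cloud state channel shared across devices."""
    def __init__(self):
        self.t = 0.0            # normalized time index in [0, 1]
        self.subscribers = []

    def publish(self, t):
        self.t = max(0.0, min(1.0, t))
        for callback in self.subscribers:
            callback(self.t)

    def subscribe(self, callback):
        self.subscribers.append(callback)

class Device:
    """A display that renders the frame for the shared time index."""
    def __init__(self, name, frame_count, channel):
        self.name = name
        self.frame_count = frame_count
        self.current_frame = 0
        channel.subscribe(self.on_time)

    def on_time(self, t):
        self.current_frame = round(t * (self.frame_count - 1))

# Two devices (e.g. a local tablet and a remote desktop) stay in sync;
# because the channel is bidirectional, either device's sensor can drive both.
channel = SharedTimeState()
tablet = Device("tablet", 120, channel)
desktop = Device("desktop", 120, channel)
channel.publish(0.5)
```

In this sketch the shared state is a single scalar, which is what makes bidirectional, tele-present scrubbing cheap: any device can write it, and every subscriber sees the same frame.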

We introduce the core concept of sensor-mediated abstraction of time for quickly generating video-based interactive prototypes of screen-based applications, share the results of observations of long-term usage of video-escapement techniques with experienced interaction designers, and articulate design choices for supporting a reflective, iterative, and open-ended creative design process.

Figure 1: On the left, a person tilts a hand-held tablet forward; this gesture is labeled "Interactive prototyping with video." On the right, images show sequential steps through a video sequence. Each frame lies along a timeline, normalized from 0 to 1, and a single frame in the timeline is labeled the "timeframe index." Boxes on the right show options for the sensor input (tilt, rotation, touch, location, angle, proximity), which map that sensor input to the timeframe index. Icons show other manipulations of the sensor data: filtering and transfer functions.

Escapement is a prototyping tool that reifies video snippets as sensor-mediated interactive prototypes for screen-based applications across one or more devices. The tool flexibly maps a variety of real-time sensor inputs (such as tilt, motion, or touch) to the time index of a pre-recorded video (or series of still images). This empowers designers to work directly with "time as a design material" in the prototyping process, exploring the feel of an interaction in response to sensor data and corresponding visual feedback.
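The core mapping can be sketched in a few lines: a raw sensor reading is normalized to [0, 1], optionally smoothed and shaped by a transfer function, and scaled to a frame index of the pre-recorded video. The names below are illustrative assumptions, not the tool's actual API:

```python
# Hypothetical sketch of the sensor-to-timeframe-index mapping at the heart
# of video-escapement prototyping (normalize -> filter -> transfer -> frame).

def normalize(value, lo, hi):
    """Clamp and map a raw sensor value into the normalized range [0, 1]."""
    t = (value - lo) / (hi - lo)
    return max(0.0, min(1.0, t))

def ease_in_out(t):
    """Example transfer function: smoothstep easing."""
    return t * t * (3.0 - 2.0 * t)

class EscapementMapping:
    def __init__(self, lo, hi, frame_count, transfer=ease_in_out, alpha=0.5):
        self.lo, self.hi = lo, hi
        self.frame_count = frame_count
        self.transfer = transfer
        self.alpha = alpha        # low-pass filter coefficient
        self.filtered = 0.0

    def frame_for(self, raw):
        t = normalize(raw, self.lo, self.hi)
        # Exponential smoothing filters out sensor jitter.
        self.filtered += self.alpha * (t - self.filtered)
        shaped = self.transfer(self.filtered)
        return round(shaped * (self.frame_count - 1))

# e.g. a tilt of 0 to 90 degrees scrubbing a 120-frame video
m = EscapementMapping(lo=0.0, hi=90.0, frame_count=120)
```

Because the video's time index is the only output, swapping the sensor, the filter, or the transfer function changes the feel of the interaction without touching the video itself.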

A table showing parameters (ways to map video time to sensor data) and strategies (prototyping strategies). The parameters are: manual, touch, tilt, angle, motion, gyroscope, and generic sensor. The strategies are: using time as a design material, using video to shape the design, experiencing embodied explorations, exploring many sensors and degrees of freedom, tweening animations, and decoupling input and output.

Parameters & Strategies with Escapement: The tool gives designers access to various parameters of control, which allow manipulation of video with time abstracted out. Over the years the tool has been in use, we identified the prototyping strategies designers employ as they create video-escapement prototypes.

This CHI 2023 research video shows the Escapement interactive video-prototyping tool in action.