Software: Design and Performance

This talk consists of two parts. In the software design part we introduce a metric that quantifies the algorithmic essence of a software system. In the software performance part we present an approach that transparently captures perceptible performance bugs in interactive applications deployed in the real world.

In the first part of this talk we present an approach that partitions a software system into its algorithmically essential parts and the parts that manifest its design. Our approach is inspired by the notion of an algorithm and its asymptotic complexity. However, we do not propose a metric for measuring asymptotic complexity (efficiency). Instead, we use the one aspect of algorithms that drives up their asymptotic complexity – repetition, in the form of loops and recursion – to determine the algorithmically essential parts of a software system. The parts of a system that are not algorithmically essential represent aspects of its design. A large fraction of inessential parts is indicative of “overdesign”, while a small fraction indicates a lack of modularization. We present a metric, relative essence, that quantifies the fraction of a program that is algorithmically essential. We evaluate our approach by studying the algorithmic essence of a large corpus of software systems, and by comparing the measured essence to an intuitive view of design “overhead”.
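To make the distinction concrete, here is a hedged sketch in Java (the class, its names, and the essential/inessential labels are our own illustration, not taken from the talk):

    import java.util.List;

    // Illustrative only: repetition (loops, recursion) marks code as
    // algorithmically essential; straight-line scaffolding manifests design.
    class OrderTotals {

        // Hypothetical domain type for this example.
        record Order(double amount) {}

        private final List<Order> orders;

        OrderTotals(List<Order> orders) {
            this.orders = orders;
        }

        // Contains a loop, so its body would count toward the essence.
        double total() {
            double sum = 0;
            for (Order o : orders) {
                sum += o.amount();
            }
            return sum;
        }

        // Pure delegation without repetition: on this view it belongs to
        // the design rather than to the algorithmic essence.
        List<Order> orders() {
            return orders;
        }
    }

On this reading, relative essence is the fraction of the program that, like total(), participates in repetition: a codebase dominated by methods like orders() would score low (suggesting overdesign), while one with almost no such scaffolding would score high (suggesting a lack of modularization).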

In the second part of the talk we present our approach to capturing perceptible performance bugs in deployed interactive applications. Our approach combines instrumentation and sampling to gather compact, actionable profiles of perceptible latency bugs. Our tool, Lag Hunter, employs a diverse set of optimizations that reduce the profiling overhead to a level that allows deployment with production applications in the field. This enables observing application performance in the contexts in which applications are actually used, and collecting profiles that are representative of real user behavior. We evaluate our approach by deploying a Lag Hunter-enabled version of Eclipse, one of the largest interactive Java applications, over the course of three months, collecting profiles that represent over 1900 hours of usage by 24 users. We then characterize the perceptible performance issues Lag Hunter identified in these sessions, and we find and fix the causes of a set of representative performance bugs.
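As a rough sketch of the measurement idea (not Lag Hunter's actual implementation; the 100 ms threshold and all names below are our assumptions), one can instrument event dispatch to time each handler, while a sampling thread captures the UI thread's stack once an episode has already lasted long enough to be perceptible:

    import java.util.concurrent.atomic.AtomicReference;

    // Hedged sketch: instrumentation times each event-handler invocation;
    // a watchdog samples the handler's stack only for episodes that exceed
    // a human-perceptibility threshold, keeping the common case cheap.
    final class LatencySketch {

        private static final long PERCEPTIBLE_MS = 100;  // assumed threshold

        static void dispatch(String event, Runnable handler) {
            final Thread uiThread = Thread.currentThread();
            final AtomicReference<StackTraceElement[]> sample = new AtomicReference<>();

            // Sampling: grab the UI thread's stack if the handler is still
            // running after the threshold has passed.
            Thread watchdog = new Thread(() -> {
                try {
                    Thread.sleep(PERCEPTIBLE_MS);
                    sample.set(uiThread.getStackTrace());
                } catch (InterruptedException ignored) {
                    // Handler finished quickly; nothing to sample.
                }
            });
            watchdog.setDaemon(true);

            // Instrumentation: time the handler invocation itself.
            long start = System.nanoTime();
            watchdog.start();
            try {
                handler.run();
            } finally {
                watchdog.interrupt();
            }
            long elapsedMs = (System.nanoTime() - start) / 1_000_000;

            StackTraceElement[] stack = sample.get();
            if (elapsedMs >= PERCEPTIBLE_MS && stack != null) {
                System.err.printf("perceptible lag: %s took %d ms%n", event, elapsedMs);
                for (StackTraceElement frame : stack) {
                    System.err.println("  at " + frame);
                }
            }
        }
    }

In a real deployment like the Eclipse study above, such wrapping would presumably be injected by instrumentation rather than written by hand, and it is the overhead of the non-lagging common case that must be minimized for production use.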

Speaker Details

Matthias Hauswirth is an assistant professor at the Faculty of Informatics at the University of Lugano (Switzerland), where he leads the Software and Programmer Efficiency (Sape) research group. He is interested in performance measurement, understanding, and optimization. Matthias received his PhD from the University of Colorado at Boulder. http://www.inf.usi.ch/faculty/hauswirth – http://sape.inf.usi.ch/

Speaker: Matthias Hauswirth
Affiliation: University of Lugano