#### Reference

Saeid Tizpaz Niari, Pavol Černý, Bor-Yuh Evan Chang, and Ashutosh Trivedi. Differential Performance Debugging with Discriminant Regression Trees. In *AAAI Conference on Artificial Intelligence (AAAI)*, 2018, pages 2468-2475.
#### Abstract

Differential performance debugging is a technique to find performance
problems. It applies in situations where the performance of a program is
(unexpectedly) different for different classes of inputs. The task is to
explain the differences in asymptotic performance among various input
classes in terms of program internals. We propose a data-driven
technique based on *discriminant regression tree* (DRT) learning,
where the goal is to discriminate among different classes of
inputs. We propose a new algorithm for DRT learning that first clusters
the data into functional clusters, capturing different asymptotic
performance classes, and then invokes off-the-shelf decision tree
learning algorithms to explain these clusters. We focus on linear
functional clusters and adapt classical clustering algorithms
(*K*-means and spectral) to produce them. For the
*K*-means algorithm, we generalize the notion of the cluster
centroid from a point to a linear function. We adapt spectral clustering
by defining a novel kernel function to capture the notion of "linear"
similarity between two data points. We evaluate our approach on
benchmarks consisting of Java programs where we are interested in
debugging performance. We show that our algorithm significantly
outperforms other well-known regression tree learning algorithms in
terms of running time and accuracy of classification.
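The generalization of the K-means centroid from a point to a linear function can be illustrated with a minimal sketch (this is an illustration under our own assumptions, not the paper's implementation; the names `fit_line` and `functional_kmeans` and the slope-based seeding heuristic are ours). Each cluster's "centroid" is the least-squares line through its members, and points are reassigned to the line with the smallest squared vertical residual:

```python
def fit_line(points):
    """Least-squares fit y = a*x + b over a cluster's (x, y) points."""
    n = len(points)
    sx = sum(x for x, _ in points)
    sy = sum(y for _, y in points)
    sxx = sum(x * x for x, _ in points)
    sxy = sum(x * y for x, y in points)
    denom = n * sxx - sx * sx
    if denom == 0:  # all x identical: horizontal line through the mean
        return 0.0, sy / n
    a = (n * sxy - sx * sy) / denom
    return a, (sy - a * sx) / n

def functional_kmeans(points, k, iters=50):
    """K-means where each 'centroid' is a line rather than a point.

    Points are (input size, running time) pairs; a point is assigned
    to the candidate line with the smallest squared vertical residual,
    and each line is refit to its assigned cluster until convergence.
    """
    # Deterministic seeding (a simplification of the paper's setup):
    # split points into k groups by their slope y/x (assumes x != 0).
    ordered = sorted(points, key=lambda p: p[1] / p[0])
    step = len(ordered) // k
    groups = [ordered[i * step:(i + 1) * step] for i in range(k - 1)]
    groups.append(ordered[(k - 1) * step:])
    lines = [fit_line(g) for g in groups]
    for _ in range(iters):
        clusters = [[] for _ in range(k)]
        for x, y in points:
            best = min(range(k),
                       key=lambda i: (y - (lines[i][0] * x + lines[i][1])) ** 2)
            clusters[best].append((x, y))
        # Refit each line; keep the old line if a cluster went empty.
        new_lines = [fit_line(c) if c else lines[i]
                     for i, c in enumerate(clusters)]
        if new_lines == lines:  # converged
            break
        lines = new_lines
    return lines, clusters
```

On inputs drawn from two linear performance classes (say, times near `2*x` and `5*x`), the recovered lines separate the classes by slope; the resulting cluster labels are what the decision-tree learner then explains in terms of program internals.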

#### BibTeX

    @string{AAAI = "AAAI Conference on Artificial Intelligence (AAAI)"}

    @inproceedings{dpdebugger-aaai18,
      author    = {Saeid Tizpaz Niari and Pavol Černý and Bor-Yuh Evan Chang and Ashutosh Trivedi},
      title     = {Differential Performance Debugging with Discriminant Regression Trees},
      booktitle = AAAI,
      year      = {2018},
      pages     = {2468--2475},
    }