CUSUM for change-point detection in time series
Method and qualitative analyses on basic scenarios
A change point in a time series is a time point at which the time series undergoes a significant change in its statistical properties. The term “change point” typically refers to a change that is durable. By contrast, the term “anomaly” refers to one that is transient.
In [1], we introduced change-point detection in time series, covered various basic scenarios with illustrations, and discussed some basic methods tuned to those scenarios.
In this post we cover and illustrate the CUSUM method, a simple method for retrospectively detecting changes in mean. It works surprisingly well.
CUSUM Method
This method first computes the mean value of the time series, then subtracts this mean from each value. The resulting time series is thus centered around 0.
Next, it computes the cumulative sums of the centered values, yielding the CUSUM time series. Finally, it detects points at which the CUSUM series changes value abruptly and flags these as the change points.
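To make these steps concrete, here is a minimal sketch in Python using numpy. The function names cusum and detect_change_point are our own, and flagging the point where the cumulative sum is farthest from zero is just one simple way to detect an abrupt change; it locates a single change in mean.

import numpy as np

def cusum(series):
    # Center the series on its overall mean, then take cumulative sums.
    centered = series - np.mean(series)
    return np.cumsum(centered)

def detect_change_point(series):
    # One simple rule (an illustrative choice, not the only one): flag the
    # index where the CUSUM curve is farthest from zero. This locates a
    # single change in the mean.
    s = cusum(series)
    return int(np.argmax(np.abs(s)))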
We illustrate this method in the figure below. To keep the illustration simple, we chose a time series whose mean is already 0. (One fewer time series to plot.)
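As a quick sanity check (this is not the exact series used in the figure), the snippet below simulates a zero-mean series with a single upward mean shift and applies the functions sketched above; the shift location t = 120 and shift size 1.5 are arbitrary choices for illustration.

import numpy as np

rng = np.random.default_rng(0)

# 200 points of unit-variance noise whose mean shifts upward at t = 120.
x = rng.normal(0.0, 1.0, 200)
x[120:] += 1.5
x -= x.mean()  # re-center so the overall mean is 0, as in the illustration

print(detect_change_point(x))  # prints an index near 120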