CUSUM for change-point detection in time series

Method and qualitative analyses of basic scenarios

Arun Jagota
Dec 26, 2023

A change point in a time series is a time point at which the time series undergoes a significant change in its statistical properties. The term “change point” typically refers to a change that is durable. By contrast, the term “anomaly” refers to one that is transient.

In [1], we introduced change-point detection in time series, covered various basic scenarios with illustrations, then discussed some basic methods tuned to those scenarios.

In this post we cover and illustrate the CUSUM method, a simple technique for retrospectively detecting changes in the mean. It works surprisingly well.

CUSUM Method

The method first computes the mean of the time series and subtracts it from each value. The resulting time series is thus centered around 0.

Next, it computes the cumulative sums of the centered values, producing a cumsum time series. Finally, it detects points at which this cumsum series changes abruptly (under a durable mean shift, the point at which its drift reverses direction) and flags these as the change points.
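These steps translate almost directly into code. The sketch below uses NumPy (an assumed choice of tooling); since the post does not spell out the detection rule, it flags the index where the cumsum is farthest from zero, which is where the cumsum's drift reverses when a single durable mean shift is present. The function name cusum_change_point is an illustrative choice, not something from the original post.

```python
import numpy as np

def cusum_change_point(x):
    """Estimate a single change point in the mean of a series.

    Center the series on its mean, take cumulative sums, and flag
    the index where the cumsum is farthest from zero, i.e. where
    its drift reverses direction under a durable mean shift.
    """
    x = np.asarray(x, dtype=float)
    centered = x - x.mean()                   # center the series around 0
    s = np.cumsum(centered)                   # cumsum of the centered series
    change_point = int(np.argmax(np.abs(s)))  # largest deviation from 0
    return change_point, s
```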

We illustrate this method in the figure below. To keep this illustration simple, we chose a time series whose mean is already 0. (One less time series to plot.)

Figure 2: A time series with a mean of 0 (in blue), its cumsum in orange, a change point in the cumsum (red point). (Ignore the red dashed vertical line.)
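As a quick check of the scenario in the figure, the snippet below builds a synthetic series whose overall mean is roughly 0 but whose level shifts halfway through, then runs the cusum_change_point sketch from above on it. The shift location (index 100), the noise level, and the random seed are arbitrary choices for illustration.

```python
import numpy as np

rng = np.random.default_rng(seed=42)

# First 100 points centered around -1, next 100 around +1: the overall
# mean is roughly 0, and the mean shifts at index 100.
x = np.concatenate([rng.normal(-1.0, 1.0, size=100),
                    rng.normal(+1.0, 1.0, size=100)])

cp, s = cusum_change_point(x)
print("estimated change point:", cp)  # expected to land near index 100
```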

