user1940350

Reputation: 173

Fault Detection on a time sequence of a variable changing (trending) over time

I am pretty new to anomaly detection on time sequences, so my question may be obvious to some of you. Today I use LSTM and clustering techniques to detect anomalies on time sequences, but those methods cannot identify anomalies that get worse slowly over time (I think this is called trending), e.g. the temperature of a machine increasing slowly over a month (the LSTM will learn this trend and predict the increase without any notable error). Is there a method to detect this kind of fault?

Upvotes: 0

Views: 141

Answers (1)

Has QUIT--Anony-Mousse

Reputation: 77454

With time series, that is usually what you want: learn gradual change, detect abrupt change. Otherwise, time plays little role.

You can try, e.g., the SigniTrend model with a very slow learning rate (a long half-life time, or whatever they called it). Ignore all the tokens, hashing, and scalability parts of that paper; just take the EWMA+EWMVar part, which I really like, and use it on your time series.

If you set the learning rate really low, the thresholds should move slowly enough that your "gradual" change may still be able to trigger them.
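Roughly, the EWMA+EWMVar idea boils down to something like the following (a minimal sketch, not the paper's actual code; the function name and the `halflife` and `z` parameters are my own illustrative choices):

```python
import numpy as np

def ewma_anomalies(series, halflife=100.0, z=3.0):
    """Flag points that deviate more than z sigma from the exponentially
    weighted running mean. A long half-life means a slow learning rate,
    so even a slow drift can eventually cross the moving threshold."""
    alpha = 1.0 - 0.5 ** (1.0 / halflife)  # per-step decay rate from the half-life
    mean, var = series[0], 0.0
    flags = [False]
    for x in series[1:]:
        sigma = np.sqrt(var)
        # Test the point against the current estimates BEFORE updating them.
        flags.append(sigma > 0 and abs(x - mean) > z * sigma)
        # Incremental EWMA / EWMVar update.
        delta = x - mean
        mean += alpha * delta
        var = (1.0 - alpha) * (var + alpha * delta * delta)
    return flags

# Example: a slow drift is mostly absorbed, but an abrupt jump is flagged.
rng = np.random.default_rng(0)
series = 20 + np.linspace(0, 2, 2000) + rng.normal(0, 0.2, 2000)
series[1500] += 5                      # abrupt fault
flags = ewma_anomalies(series, halflife=200.0)
print(np.flatnonzero(flags))           # index 1500 should be among the hits
```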

Or you ignore time completely. Split your data into a training set (which must not contain anomalies), and learn the mean and variance on that to find thresholds. Then classify any point outside these thresholds as abnormal (i.e. temperature > mean + 3 * standard deviation). As this super naive approach does not learn, it will not follow a drift either. But then time does not play any further role.
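A minimal sketch of this static-threshold variant, assuming you have a clean training array (all names here are illustrative):

```python
import numpy as np

def fit_thresholds(train, z=3.0):
    """Learn fixed lower/upper bounds from anomaly-free training data."""
    mu, sigma = train.mean(), train.std()
    return mu - z * sigma, mu + z * sigma

def classify(series, lo, hi):
    """True marks an abnormal point; the bounds never move."""
    return (series < lo) | (series > hi)

# Because the thresholds are fixed, a slow upward drift in temperature
# is eventually flagged instead of being learned away.
rng = np.random.default_rng(0)
train = 20 + rng.normal(0, 0.5, 1000)                            # stable machine
lo, hi = fit_thresholds(train)
test = 20 + np.linspace(0, 5, 1000) + rng.normal(0, 0.5, 1000)   # slow drift up
print(classify(test, lo, hi).mean())   # growing fraction of points flagged
```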

Upvotes: 1
