on 2020 Sep 09 11:20 AM
I'm trying to detect outliers in a time series that contains seasonality.
I'm using the industry-standard constant of 2.5: the system flags a historical value as an outlier when it deviates from the mean by more than 2.5 times the standard deviation. The problem is that this rule confuses genuine outliers with the regular high/low peaks of the seasonal pattern.
It would be very helpful if someone could share a solution to this problem.
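A minimal sketch of the rule described, plus one common remedy: estimate the seasonal component first and apply the same k-sigma threshold to the residuals instead of the raw values. The per-phase seasonal mean, the period of 24, and the injected spike below are illustrative assumptions, not part of the original question.

```python
import numpy as np

def zscore_outliers(x, k=2.5):
    """Naive rule from the question: flag values deviating from the
    mean by more than k standard deviations."""
    x = np.asarray(x, dtype=float)
    return np.abs(x - x.mean()) > k * x.std()

def seasonal_zscore_outliers(x, period, k=2.5):
    """Subtract a simple per-phase seasonal mean first, then apply the
    same k-sigma rule to the residuals, so that regular seasonal peaks
    are no longer mistaken for outliers."""
    x = np.asarray(x, dtype=float)
    phases = np.arange(len(x)) % period
    seasonal = np.array([x[phases == p].mean() for p in range(period)])
    return zscore_outliers(x - seasonal[phases], k)

# Toy data: a sharp repeating peak every 24 steps plus noise, with one
# real anomaly injected off-peak at t=100.
rng = np.random.default_rng(0)
n, period = 240, 24
x = rng.normal(0.0, 0.5, n)
x[::period] += 20.0   # seasonal peaks (normal behavior)
x[100] += 10.0        # the actual anomaly

naive = zscore_outliers(x)
seasonal = seasonal_zscore_outliers(x, period)
print("naive flags:", np.where(naive)[0])
print("seasonal-adjusted flags:", np.where(seasonal)[0])
```

On this toy data the naive rule flags every seasonal peak, while the deseasonalized rule isolates the injected spike. In practice a proper decomposition, such as `STL` from `statsmodels.tsa.seasonal`, is a more robust way to obtain the residuals than the per-phase mean used here.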