
'softmax' should not be applied to compute the 'metric' in lines 319 and 280 #66

Open
guanwei49 opened this issue May 16, 2024 · 0 comments

Comments

@guanwei49

This method achieves excellent performance primarily because of the combination of 'detection adjustment' and 'softmax'. In 'solver.py', the 'metric' is computed with 'softmax' in lines 319 and 280. Because the softmax is taken over the timestamps of each window, the scores in every window sum to 1 and the largest one is pushed close to 1, so each window contains at least one timestamp whose anomaly score is notably larger than the others in the same window. Consequently, in the 'pred' output, almost every window is flagged as containing at least one anomaly. When 'detection adjustment' is then applied, any such flagged point inside a true anomaly segment causes the entire continuous anomaly sequence to be labeled as anomalous. However, 'softmax' is not suitable in this context, because it merely normalizes the scores within a window and cannot effectively model the relationship between the different timestamps inside it. Removing 'softmax' from lines 319 and 280 leads to a significant decrease in performance.
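
Below is a minimal sketch of the effect (not the repository's code), assuming the 'metric' is essentially a softmax taken over the timestamps of each window; the `scale` factor is a hypothetical stand-in for any temperature-like sharpening of the scores:

```python
import numpy as np

def softmax(x, axis=-1):
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

rng = np.random.default_rng(0)
num_windows, win_size, scale = 200, 100, 50.0  # scale is hypothetical

# Per-timestamp scores drawn from pure noise: there is no real anomaly at all.
raw = rng.normal(size=(num_windows, win_size))

# Softmax over the time dimension of each window, as described above.
metric = softmax(scale * raw, axis=-1)

# Softmax normalizes each window to sum to 1 and only reflects the *relative*
# standing of a timestamp inside its own window, so the per-window maximum is
# pushed toward 1 even though nothing is anomalous.
print("mean per-window max of the metric:", float(metric.max(axis=-1).mean()))

# Any global threshold below that value flags at least one timestamp in almost
# every window; detection adjustment then converts such a single flagged point
# into an entire flagged anomaly segment.
```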

As an alternative illustration, consider a scenario where I simply designate one timestamp as an anomaly every 100 timestamps and then apply 'detection adjustment'. Despite this simplistic approach, I still obtain highly satisfactory results, because almost every anomaly segment longer than 100 timestamps contains one of the flagged points and is therefore fully credited by the adjustment.
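
To make this thought experiment concrete, here is a small sketch with synthetic, hypothetical labels (not the repository's evaluation code), assuming the usual point-wise detection adjustment: if any point inside a true anomaly segment is predicted anomalous, the whole segment counts as detected. The data-independent "detector" below already scores very highly after adjustment:

```python
import numpy as np

rng = np.random.default_rng(0)
n = 100_000

# Synthetic ground truth: anomaly segments of hypothetical lengths (50-199
# points) separated by normal gaps, covering roughly 10% of the series.
gt = np.zeros(n, dtype=int)
pos = 0
while pos < n:
    pos += int(rng.integers(500, 2000))   # normal gap before the next segment
    seg = int(rng.integers(50, 200))      # anomaly segment length
    gt[pos:pos + seg] = 1
    pos += seg

# "Detector": flag one timestamp out of every 100, ignoring the data entirely.
pred = np.zeros(n, dtype=int)
pred[::100] = 1

# Detection adjustment: if any point inside a true anomaly segment is
# predicted anomalous, mark every point of that segment as detected.
adj = pred.copy()
start, in_seg = 0, False
for i in range(n + 1):
    is_anom = i < n and gt[i] == 1
    if is_anom and not in_seg:
        start, in_seg = i, True
    elif not is_anom and in_seg:
        in_seg = False
        if adj[start:i].any():
            adj[start:i] = 1

# Point-wise precision/recall/F1 after the adjustment.
tp = int(((adj == 1) & (gt == 1)).sum())
fp = int(((adj == 1) & (gt == 0)).sum())
fn = int(((adj == 0) & (gt == 1)).sum())
precision, recall = tp / (tp + fp), tp / (tp + fn)
print(f"precision={precision:.3f} recall={recall:.3f} "
      f"F1={2 * precision * recall / (precision + recall):.3f}")
```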
