Objective Revision Evaluation Service/Issues
This page is kept for historical interest. Any policies mentioned may be obsolete. If you want to revive the topic, you can use the talk page or start a discussion on the community forum.
ORES makes mistakes. It's not as smart as humans are. To make sure ORES is as useful as possible despite its limitations, documentation of its issues and mistakes is critical. This page is intended to be a central reference for reporting and discussing issues with ORES.
Edit quality
- See /Edit_quality for reporting misclassifications.
Bias against anonymous editors
- T118982 -- hewiki "reverted" model weights strongly against anons
- T129624 -- Investigate nlwiki 'reverted' model seems broken (always ~0.89 for anonymous edits; see the score-check sketch after this list)
- T120138 -- [Epic] Explore disparate impacts of damage detection and goodfaith prediction on anons and newcomers.
- Research talk:Automated classification of edit quality/Work log/2016-04-14 -- Analysis of bias against anons when switching in GradientBoosting model
- Presentation by EpochFail that discusses mitigating bias against anons (slides)
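Reports like the ones above can be spot-checked against the live service. The following is a minimal sketch, assuming the ORES v3 scoring endpoint is still serving a "reverted" model for the wiki in question; the revision ID is a placeholder, and the exact response structure may differ from what is shown here.

```python
# Minimal sketch: spot-check a "reverted" score for a single revision via the
# ORES scoring API (v3 endpoint shape as historically deployed; availability of
# the model for a given wiki is an assumption).
import requests

ORES_URL = "https://ores.wikimedia.org/v3/scores/{wiki}/"


def reverted_probability(wiki, rev_id, model="reverted"):
    """Return the model's probability that the given revision will be reverted."""
    resp = requests.get(
        ORES_URL.format(wiki=wiki),
        params={"models": model, "revids": rev_id},
        headers={"User-Agent": "ores-issues-spot-check"},
        timeout=30,
    )
    resp.raise_for_status()
    # Drill into the per-revision score object returned for the requested model.
    score = resp.json()[wiki]["scores"][str(rev_id)][model]["score"]
    return score["probability"]["true"]


# Example: compare scores for an anonymous edit and a registered editor's edit.
# The revision ID below is a placeholder; substitute real ones to reproduce the reports.
# print(reverted_probability("nlwiki", 123456789))
```

Comparing scores for otherwise similar anonymous and registered edits is a quick way to see whether the anon signal is dominating the prediction.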
We're working to mitigate the problem by increasing signal from non-user-related features.
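As an illustration of that approach, the sketch below trains a classifier on content-based signals only, deliberately leaving out user-related features so the model cannot lean on editor identity. The feature names and labeled dataset are illustrative assumptions, not the actual revscoring feature set; the model family matches the GradientBoosting model mentioned in the work log above.

```python
# Minimal sketch of the mitigation idea: drop user-identity features and rely on
# edit-content features when training the damage/revert model. All column names
# and the input file are hypothetical placeholders.
import pandas as pd
from sklearn.ensemble import GradientBoostingClassifier
from sklearn.model_selection import train_test_split

edits = pd.read_csv("labeled_edits.csv")  # hypothetical labeled sample of edits

user_features = ["user_is_anon", "user_age", "user_edit_count"]       # identity signals (excluded)
content_features = ["bytes_changed", "badwords_added", "refs_added",  # edit-content signals (kept)
                    "markup_added", "chars_repeated"]

X = edits[content_features]  # user_features deliberately excluded from training
y = edits["reverted"]

X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.2, random_state=0)

model = GradientBoostingClassifier()
model.fit(X_train, y_train)
print("held-out accuracy:", model.score(X_test, y_test))
```

The trade-off is that identity features carry real predictive signal, so removing them can cost some raw accuracy; the goal is to recover that signal from the content of the edit itself rather than from who made it.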