| Field | Value |
|---|---|
| Download | View final version: Necessity and sufficiency for explaining text classifiers: a case study in hate speech detection (PDF, 407 KiB) |
| DOI | https://doi.org/10.18653/v1/2022.naacl-main.192 |
| Authors | Balkir, Esma; Nejadgholi, Isar; Fraser, Kathleen; Kiritchenko, Svetlana |
| Affiliation | National Research Council of Canada, Digital Technologies |
| Format | Text, Article |
| Conference | Proceedings of the 2022 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies (NAACL 2022), July 10-15, 2022, Seattle, Washington (Hybrid Conference) |
| Abstract | We present a novel feature attribution method for explaining text classifiers, and analyze it in the context of hate speech detection. Although feature attribution models usually provide a single importance score for each token, we instead provide two complementary and theoretically-grounded scores – necessity and sufficiency – resulting in more informative explanations. We propose a transparent method that calculates these values by generating explicit perturbations of the input text, allowing the importance scores themselves to be explainable. We employ our method to explain the predictions of different hate speech detection models on the same set of curated examples from a test suite, and show that different values of necessity and sufficiency for identity terms correspond to different kinds of false positive errors, exposing sources of classifier bias against marginalized groups. |
| Publication date | 2022-07-15 |
| Publisher | Association for Computational Linguistics (ACL) |
| Language | English |
| Peer reviewed | Yes |
| Record identifier | 9c698f67-b648-4edb-866a-4106252f0b1b |
| Record created | 2022-09-09 |
| Record modified | 2022-09-12 |
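The abstract describes computing necessity and sufficiency scores by generating explicit perturbations of the input text. The toy sketch below illustrates the general idea only, under simplifying assumptions; it is not the authors' implementation. Here, necessity is estimated as the fraction of perturbations of the target token that flip the classifier's prediction, and sufficiency as the fraction of perturbations of the *other* tokens that preserve it. The `toy_classifier`, the `perturb` strategy, and the replacement vocabulary are all hypothetical placeholders.

```python
import random


def toy_classifier(tokens):
    # Hypothetical stand-in for a hate speech detector:
    # flags any text containing the token "awful".
    return 1 if "awful" in tokens else 0


def perturb(tokens, positions, replacements):
    # Replace the given positions with randomly chosen substitute tokens.
    out = list(tokens)
    for i in positions:
        out[i] = random.choice(replacements)
    return out


def necessity(classifier, tokens, target, replacements, n=200, seed=0):
    """Fraction of perturbations of the target token that flip the prediction."""
    random.seed(seed)
    original = classifier(tokens)
    flips = sum(
        classifier(perturb(tokens, [target], replacements)) != original
        for _ in range(n)
    )
    return flips / n


def sufficiency(classifier, tokens, target, replacements, n=200, seed=0):
    """Fraction of perturbations of all other tokens that keep the prediction."""
    random.seed(seed)
    original = classifier(tokens)
    others = [i for i in range(len(tokens)) if i != target]
    keeps = sum(
        classifier(perturb(tokens, others, replacements)) == original
        for _ in range(n)
    )
    return keeps / n


tokens = ["you", "are", "awful"]
subs = ["nice", "fine", "okay"]  # assumed neutral replacement vocabulary
nec = necessity(toy_classifier, tokens, target=2, replacements=subs)
suf = sufficiency(toy_classifier, tokens, target=2, replacements=subs)
```

For this toy classifier the token "awful" is both fully necessary (replacing it always flips the prediction) and fully sufficient (perturbing everything else never does), so `nec` and `suf` both come out as 1.0; in realistic models the two scores diverge, which is the distinction the paper exploits to characterize false positives.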