This leaderboard lists, for each task, the best score achieved, together with the submission title and the name of the model that produced it.
Best Submission Information

Task Name | Identifier | Submission Title | Model Name | Score
--- | --- | --- | --- | ---
Aggressive-hin-Kumar | aggressive-2018-kumar-hin | Finetuned LMs | XLM-RoBERTa-Large | 75.7885 |
Dangerous-ara-Alshehri | dangerous-2020-alshehri-ara | Finetuned LMs | XLM-Twitter | 67.1497 |
Emotion-ara-Abdul | emotion-2020-abdul-ara | Finetuned LMs | Bernice | 66.2361 |
Emotion-ara-Mohammad | emotion-2018-mohammad-ara | Finetuned LMs | XLM-RoBERTa-Large | 85.0011 |
Emotion-ben-Iqbal | emotion-2022-iqbal-ben | Finetuned LMs | XLM-RoBERTa-Large | 65.7731 |
Emotion-deu-Bianchi | emotion-2022-bianchi-deu | Finetuned LMs | XLM-RoBERTa-Large | 84.8025 |
Emotion-eng-Demszky | emotion-2020-demszky-eng | Finetuned LMs | InfoDCL | 58.8281 |
Emotion-eng-Mohammad | emotion-2018-mohammad-eng | Finetuned LMs | TwHIN-BERT | 81.4713 |
Emotion-eng-Plaza | emotion-2020-plaza-eng | Finetuned LMs | XLM-RoBERTa-Large | 51.0465 |
Emotion-eng-Wallbott | emotion-1986-wallbott-eng | Finetuned LMs | XLM-RoBERTa-Large | 74.4317 |
Emotion-fas-Sabri | emotion-2021-sabri-fas | Zero-shot | ChatGPT | 28.5941
Emotion-fin-Kajava | emotion-2018-kajava-fin | Finetuned LMs | XLM-RoBERTa-Large | 63.2954 |
Emotion-fin-Ohman | emotion-2020-ohman-fin | Finetuned LMs | XLM-RoBERTa-Large | 59.3721 |
Emotion-fre-Bianchi | emotion-2022-bianchi-fre | Finetuned LMs | Bernice | 84.1453 |
Emotion-fre-Kajava | emotion-2018-kajava-fre | Finetuned LMs | XLM-RoBERTa-Large | 63.1902 |
Emotion-hin-Debaditya | emotion-2021-debaditya-hin | Finetuned LMs | XLM-RoBERTa-Large | 53.3858 |
Emotion-ind-Saputri | emotion-2019-saputri-ind | Finetuned LMs | XLM-RoBERTa-Large | 81.4263 |
Emotion-ind-Wilie | emotion-2020-wilie-ind | Finetuned LMs | XLM-RoBERTa-Large | 77.8397 |
Emotion-ita-Bianchi | emotion-2021-bianchi-ita | Finetuned LMs | Bernice | 76.7783 |
Emotion-ita-Kajava | emotion-2018-kajava-ita | Finetuned LMs | XLM-RoBERTa-Large | 64.2511 |
Emotion-por-Cortiz | emotion-2021-cortiz-por | Finetuned LMs | XLM-Twitter | 77.2545 |
Emotion-ron-Ciobotaru | emotion-2021-ciobotaru-ron | Finetuned LMs | XLM-RoBERTa-Large | 92.9088 |
Emotion-rus-Sboev | emotion-2021-sboev-rus | Finetuned LMs | XLM-RoBERTa-Large | 86.1962 |
Emotion-spa-Mohammad | emotion-2018-mohammad-spa | Finetuned LMs | InfoDCL | 85.4896 |
Emotion-spa-Plaza | emotion-2020-plaza-spa | Finetuned LMs | Bernice | 57.9461 |
Emotion-tur-Guven | emotion-2020-guven-tur | Finetuned LMs | InfoDCL | 99.3333 |
Emotion-vie-Ho | emotion-2019-ho-vie | Finetuned LMs | XLM-RoBERTa-Large | 65.9179 |
Emotion-zho-Lee | emotion-2015-lee-zho | Finetuned LMs | XLM-RoBERTa-Large | 74.0032 |
Hate-ara-Alakrot | hate-2018-alakrot-ara | Finetuned LMs | InfoDCL | 85.9529 |
Hate-ara-Mubarak | hate-2020-mubarak-ara | Finetuned LMs | XLM-RoBERTa-Large | 83.7121 |
Hate-ara-Mulki | hate-2019-mulki-ara | Finetuned LMs | Bernice | 76.5326 |
Hate-eng-Basile | hate-2019-basile-eng | Zero-shot | ChatGPT | 63.6883
Hate-eng-Basile | hate-2019-basile-eng | Zero-shot | ChatGPT with translated prompts | 63.6883
Hate-eng-Davidson | hate-2017-davidson-eng | Finetuned LMs | XLM-RoBERTa-Base | 92.9558 |
Hate-eng-Waseem | hate-2016-waseem-eng | Finetuned LMs | XLM-RoBERTa-Large | 89.7468 |
Hate-fil-Cabasag | hate-2019-cabasag-fil | Finetuned LMs | TwHIN-BERT | 80.2842 |
Hate-Group-ara-Ousidhoum | hate-group-2019-ousidhoum-ara | Finetuned LMs | XLM-RoBERTa-Large | 58.3654 |
Hate-Group-fre-Ousidhoum | hate-group-2019-ousidhoum-fre | Finetuned LMs | XLM-RoBERTa-Large | 40.5495 |
Hate-ita-Bosco | hate-2018-bosco-ita | Finetuned LMs | XLM-RoBERTa-Large | 81.1843 |
Hate-kor-Jeong | hate-2022-jeong-kor | Finetuned LMs | XLM-RoBERTa-Large | 81.6544 |
Hate-kor-Moon | hate-2020-moon-kor | Finetuned LMs | XLM-RoBERTa-Large | 65.4247 |
Hate-pol-Ptaszynski | hate-2019-ptaszynski-pol | Zero-shot | ChatGPT | 77.0219
Hate-por-Fortuna | hate-2019-fortuna-por | Finetuned LMs | Bernice | 74.4002 |
Hate-spa-Basile | hate-2019-basile-spa | Finetuned LMs | Bernice | 78.2002 |
Hate-Target-ara-Ousidhoum | hate-target-2019-ousidhoum-ara | Finetuned LMs | XLM-RoBERTa-Large | 56.8406 |
Hate-Target-ben-Karim | hate-target-2020-karim-ben | Finetuned LMs | InfoDCL | 86.3043 |
Hate-Target-fre-Ousidhoum | hate-target-2019-ousidhoum-fre | Finetuned LMs | InfoDCL | 48.1783 |
Hate-Target-kor-Jeong | hate-target-2022-jeong-kor | Finetuned LMs | XLM-RoBERTa-Large | 68.4805 |
Hate-tel-Marreddy | hate-2022-marreddy-tel | Finetuned LMs | Bernice | 58.2163 |
Hate-zho-Deng | hate-2022-deng-zho | Finetuned LMs | InfoDCL | 84.8631 |
Humor-eng-Meaney | humor-2021-meaney-eng | Finetuned LMs | Bernice | 93.4065 |
Humor-hin-Aggarwal | humor-2020-aggarwal-hin | Finetuned LMs | InfoDCL | 80.4 |
Humor-rus-Blinov | humor-2019-blinov-rus | Finetuned LMs | XLM-RoBERTa-Large | 89.7956 |
Humor-spa-Chiruzzo | humor-2021-chiruzzo-spa | Finetuned LMs | InfoDCL | 87.664 |
Irony-ara-Ghanem | irony-2019-ghanem-ara | Finetuned LMs | XLM-RoBERTa-Base | 83.9505 |
Irony-eng-Hee | irony-2018-hee-eng | Finetuned LMs | InfoDCL | 68.2543 |
Irony-fas-Golazizian | irony-2020-golazizian-fas | Finetuned LMs | InfoDCL | 76.5306 |
Irony-hin-Vijay | irony-2018-vijay-hin | Finetuned LMs | XLM-RoBERTa-Base | 73.2472 |
Irony-ita-Basile | irony-2014-basile-ita | Finetuned LMs | XLM-RoBERTa-Large | 67.3033 |
Irony-ita-Cignarella | irony-2018-cignarella-ita | Finetuned LMs | XLM-RoBERTa-Large | 79.2685 |
Irony-spa-Barbieri | irony-2016-barbieri-spa | Finetuned LMs | InfoDCL | 67.5232 |
Irony-spa-Ortega | irony-2019-ortega-spa | Finetuned LMs | InfoDCL | 73.8797 |
Irony-Type-eng-Hee | irony-type-2018-hee-eng | Finetuned LMs | InfoDCL | 57.5832 |
Irony-zho-Xiang | irony-2020-xiang-zho | Finetuned LMs | InfoDCL | 33.3641 |
Offense-Target-kan-Chakravarthi | offense-target-2022-chakravart | Finetuned LMs | InfoDCL | 81.5283 |
Offense-Target-mal-Chakravarthi | offense-target-2022-chakravart | Finetuned LMs | InfoDCL | 81.5283 |
Offense-Target-tam-Chakravarthi | offense-target-2022-chakravart | Finetuned LMs | InfoDCL | 81.5283 |
Offensive-ara-Mubarak | offensive-2020-mubarak-ara | Finetuned LMs | Bernice | 93.1568 |
Offensive-ara-Zampieri | offensive-2020-zampieri-ara | Finetuned LMs | Bernice | 91.5521 |
Offensive-dan-Zampieri | offensive-2020-zampieri-dan | Finetuned LMs | InfoDCL | 82.0941 |
Offensive-ell-Zampieri | offensive-2020-zampieri-ell | Finetuned LMs | mBERT | 80.6361 |
Offensive-eng-Zampieri | offensive-2019-zampieri-eng | Finetuned LMs | InfoDCL | 78.6678 |
Offensive-Group-eng-Zampieri | offensive-group-2019-zampieri- | Finetuned LMs | Bernice | 61.9324 |
Offensive-slv-Novak | offensive-2021-novak-slv | Finetuned LMs | mBERT | 63.2293 |
Offensive-Target-eng-Zampieri | offensive-target-2019-zampieri | Finetuned LMs | Bernice | 78.1616 |
Offensive-tur-Zampieri | offensive-2020-zampieri-tur | Finetuned LMs | TwHIN-BERT | 79.5932 |
Sarcasm-ara-Abufarha | sarcasm-2020-abufarha-ara | Finetuned LMs | XLM-RoBERTa-Large | 75.575 |
Sarcasm-ara-Farha | sarcasm-2021-farha-ara | Finetuned LMs | XLM-RoBERTa-Large | 70.8533 |
Sarcasm-ces-Ptacek | sarcasm-2014-ptacek-ces | Finetuned LMs | XLM-RoBERTa-Large | 70.4269 |
Sarcasm-eng-Bamman | sarcasm-2015-bamman-eng | Finetuned LMs | Bernice | 82.4 |
Sarcasm-eng-Oraby | sarcasm-2016-oraby-eng | Finetuned LMs | XLM-RoBERTa-Large | 77.0769 |
Sarcasm-eng-Ptacek | sarcasm-2014-ptacek-eng | Finetuned LMs | XLM-RoBERTa-Base | 95.7637 |
Sarcasm-eng-Rajadesingan | sarcasm-2015-rajadesingan-eng | Finetuned LMs | Bernice | 96.2667 |
Sarcasm-eng-Riloff | sarcasm-2013-riloff-eng | Finetuned LMs | InfoDCL | 57.4589 |
Sarcasm-eng-Walker | sarcasm-2012-walker-eng | Zero-shot | ChatGPT | 70.7925
Sarcasm-eng-Walker | sarcasm-2012-walker-eng | Zero-shot | ChatGPT with translated prompts | 70.7925
Sarcasm-zho-Gong | sarcasm-2020-gong-zho | Finetuned LMs | XLM-RoBERTa-Large | 74.7643 |
Sentiment-5-eng-Socher | sentiment-5-2013-socher-eng | Finetuned LMs | XLM-RoBERTa-Large | 54.8 |
Sentiment-ace-Winata | sentiment-2022-winata-ace | Finetuned LMs | InfoDCL | 77.3559 |
Sentiment-amh-Muhammad | sentiment-2022-muhammad-amh | Finetuned LMs | TwHIN-BERT | 66.6 |
Sentiment-ara-Abdul | sentiment-2021-abdul-ara | Finetuned LMs | TwHIN-BERT | 77.6051 |
Sentiment-ara-Muhammad | sentiment-2022-muhammad-ara | Finetuned LMs | InfoDCL | 62.2 |
Sentiment-arq-Muhammad | sentiment-2022-muhammad-arq | Finetuned LMs | InfoDCL | 71.2518 |
Sentiment-ary-Muhammad | sentiment-2022-muhammad-ary | Finetuned LMs | InfoDCL | 53.4436 |
Sentiment-bam-Diallo | sentiment-2021-diallo-bam | Finetuned LMs | InfoDCL | 65.5674 |
Sentiment-ban-Winata | sentiment-2022-winata-ban | Finetuned LMs | XLM-Twitter | 79.7713 |
Sentiment-bbc-Winata | sentiment-2022-winata-bbc | Finetuned LMs | Bernice | 75.1651 |
Sentiment-ben-Islam | sentiment-2021-islam-ben | Finetuned LMs | XLM-RoBERTa-Large | 69.4916 |
Sentiment-ben-Patra | sentiment-2018-patra-ben | Finetuned LMs | XLM-RoBERTa-Large | 62.5481 |
Sentiment-bjn-Winata | sentiment-2022-winata-bjn | Finetuned LMs | InfoDCL | 84.4963 |
Sentiment-bos-Mozetic | sentiment-2016-mozetic-bos | Finetuned LMs | XLM-RoBERTa-Large | 71.9626 |
Sentiment-bug-Winata | sentiment-2022-winata-bug | Finetuned LMs | mBERT | 74.5701 |
Sentiment-bul-Mozetic | sentiment-2016-mozetic-bul | Finetuned LMs | XLM-RoBERTa-Large | 67.973 |
Sentiment-deu-Mozetic | sentiment-2016-mozetic-deu | Finetuned LMs | XLM-RoBERTa-Large | 64.2712 |
Sentiment-deu-Rei | sentiment-2016-rei-deu | Finetuned LMs | InfoDCL | 60.9438 |
Sentiment-eng-Mozetic | sentiment-2016-mozetic-eng | Finetuned LMs | XLM-RoBERTa-Large | 69.7215 |
Sentiment-eng-Pang | sentiment-2005-pang-eng | Zero-shot | BLOOMZ-7B | 97.2 |
Sentiment-eng-Rosenthal | sentiment-2017-rosenthal-eng | Finetuned LMs | Bernice | 70.8997 |
Sentiment-eng-Socher | sentiment-2013-socher-eng | Zero-shot | BLOOMZ-P3-7B | 93 |
Sentiment-fas-Ashrafi | sentiment-2020-ashrafi-fas | Finetuned LMs | FaBERT | 87.5152
Sentiment-fin-Kajava | sentiment-2018-kajava-fin | Finetuned LMs | XLM-RoBERTa-Large | 86.949 |
Sentiment-fre-Kajava | sentiment-2018-kajava-fre | Finetuned LMs | XLM-RoBERTa-Large | 88.701 |
Sentiment-hau-Muhammad | sentiment-2022-muhammad-hau | Finetuned LMs | XLM-RoBERTa-Large | 75.0646 |
Sentiment-heb-Amram | sentiment-2018-amram-heb | Finetuned LMs | InfoDCL | 95.8 |
Sentiment-hin-Patra | sentiment-2018-patra-hin | Finetuned LMs | XLM-RoBERTa-Large | 62.3897 |
Sentiment-hrv-Mozetic | sentiment-2016-mozetic-hrv | Finetuned LMs | XLM-RoBERTa-Large | 71.0063 |
Sentiment-hun-Mozetic | sentiment-2016-mozetic-hun | Finetuned LMs | XLM-RoBERTa-Base | 71.6167 |
Sentiment-ibo-Muhammad | sentiment-2022-muhammad-ibo | Finetuned LMs | Bernice | 78.3577 |
Sentiment-ind-Wilie | sentiment-2020-wilie-ind | Finetuned LMs | InfoDCL | 92.6987 |
Sentiment-ita-Basile | sentiment-2014-basile-ita | Finetuned LMs | InfoDCL | 88.7675 |
Sentiment-ita-Basile | sentiment-2016-basile-ita | Zero-shot | ChatGPT with translated prompts | 85.4423
Sentiment-ita-Kajava | sentiment-2018-kajava-ita | Finetuned LMs | InfoDCL | 86.1687 |
Sentiment-ita-Rei | sentiment-2016-rei-ita | Finetuned LMs | InfoDCL | 51.1953 |
Sentiment-jav-Winata | sentiment-2022-winata-jav | Finetuned LMs | XLM-RoBERTa-Large | 86.9069 |
Sentiment-jpn-Suzuki | sentiment-2022-suzuki-jpn | Finetuned LMs | XLM-RoBERTa-Large | 62.3333 |
Sentiment-kan-Chakravarthi | sentiment-2022-chakravarthi-ka | Finetuned LMs | InfoDCL | 82.9212 |
Sentiment-kin-Muhammad | sentiment-2022-muhammad-kin | Finetuned LMs | TwHIN-BERT | 59.8569 |
Sentiment-kor-Jang | sentiment-2013-jang-kor | Finetuned LMs | XLM-RoBERTa-Large | 45.4451 |
Sentiment-mad-Winata | sentiment-2022-winata-mad | Finetuned LMs | InfoDCL | 78.3642 |
Sentiment-mal-Chakravarthi | sentiment-2022-chakravarthi-ma | Finetuned LMs | XLM-RoBERTa-Large | 86.4498 |
Sentiment-mar-Kulkarni | sentiment-2021-kulkarni-mar | Finetuned LMs | InfoDCL | 86.4667 |
Sentiment-min-Winata | sentiment-2022-winata-min | Finetuned LMs | XLM-RoBERTa-Large | 84.2426 |
Sentiment-mlt-Dingli | sentiment-2016-dingli-mlt | Zero-shot | ChatGPT | 78.4689
Sentiment-nij-Winata | sentiment-2022-winata-nij | Finetuned LMs | XLM-RoBERTa-Large | 78.3135 |
Sentiment-nor-Velldal | sentiment-2018-velldal-nor | Finetuned LMs | XLM-RoBERTa-Base | 51.1497 |
Sentiment-pcm-Muhammad | sentiment-2022-muhammad-pcm | Zero-shot | ChatGPT | 70.0789
Sentiment-pcm-Oyewusi | sentiment-2020-oyewusi-pcm | Finetuned LMs | XLM-RoBERTa-Large | 69.5089 |
Sentiment-pol-Kocon | sentiment-2019-kocon-pol | Finetuned LMs | XLM-RoBERTa-Large | 82.6395 |
Sentiment-pol-Mozetic | sentiment-2016-mozetic-pol | Finetuned LMs | XLM-RoBERTa-Large | 70.0465 |
Sentiment-pol-Rybak | sentiment-2020-rybak-pol | Finetuned LMs | XLM-RoBERTa-Large | 55.6706 |
Sentiment-por-Brum | sentiment-2018-brum-por | Finetuned LMs | XLM-RoBERTa-Base | 60.0005 |
Sentiment-por-Mozetic | sentiment-2016-mozetic-por | Finetuned LMs | XLM-RoBERTa-Large | 57.2246 |
Sentiment-ron-Dumitrescu | sentiment-2020-dumitrescu-ron | Zero-shot | ChatGPT with translated prompts | 93.9182
Sentiment-rus-Mozetic | sentiment-2016-mozetic-rus | Finetuned LMs | InfoDCL | 80.3694 |
Sentiment-slk-Mozetic | sentiment-2016-mozetic-slk | Finetuned LMs | XLM-RoBERTa-Large | 75.9312 |
Sentiment-slv-Mozetic | sentiment-2016-mozetic-slv | Finetuned LMs | XLM-RoBERTa-Large | 63.8495 |
Sentiment-spa-Mozetic | sentiment-2016-mozetic-spa | Finetuned LMs | XLM-Twitter | 57.1888 |
Sentiment-spa-Rei | sentiment-2016-rei-spa | Finetuned LMs | InfoDCL | 52.827 |
Sentiment-sqi-Mozetic | sentiment-2016-mozetic-sqi | Finetuned LMs | XLM-RoBERTa-Large | 49.537 |
Sentiment-srp-Mozetic | sentiment-2016-mozetic-srp | Finetuned LMs | XLM-RoBERTa-Large | 58.35 |
Sentiment-sun-Winata | sentiment-2022-winata-sun | Finetuned LMs | XLM-RoBERTa-Large | 84.4773 |
Sentiment-swe-Mozetic | sentiment-2016-mozetic-swe | Finetuned LMs | XLM-RoBERTa-Large | 73.1889 |
Sentiment-swh-Muhammad | sentiment-2022-muhammad-swh | Finetuned LMs | XLM-RoBERTa-Large | 62.687 |
Sentiment-tam-Chakravarthi | sentiment-2022-chakravarthi-ta | Finetuned LMs | XLM-RoBERTa-Large | 77.4016 |
Sentiment-tel-Marreddy | sentiment-2022-marreddy-tel | Finetuned LMs | XLM-RoBERTa-Large | 71.5399 |
Sentiment-tha-Suriyawongkul | sentiment-2019-suriyawongkul-t | Finetuned LMs | InfoDCL | 75.1656 |
Sentiment-tso-Muhammad | sentiment-2022-muhammad-tso | Finetuned LMs | XLM-Twitter | 53.7627 |
Sentiment-Tw-eng-Thelwall | sentiment-tw-2012-thelwall-eng | Finetuned LMs | Bernice | 91.4 |
Sentiment-twi-Muhammad | sentiment-2022-muhammad-twi | Finetuned LMs | XLM-RoBERTa-Large | 66.7873 |
Sentiment-yor-Muhammad | sentiment-2022-muhammad-yor | Finetuned LMs | mBERT | 67.8732 |
Sentiment-yor-Shode | sentiment-2022-shode-yor | Finetuned LMs | mBERT | 86.8636 |
Sentiment-Yt-eng-Thelwall | sentiment-yt-2012-thelwall-eng | Finetuned LMs | InfoDCL | 93.3333 |
Sentiment-zho-Tan | sentiment-2008-tan-zho | Finetuned LMs | Bernice | 96.8541 |
Sentiment-zho-Wan | sentiment-2020-wan-zho | Finetuned LMs | XLM-RoBERTa-Base | 98.9997 |
Sentiment-zho-Wan | sentiment-2020-wan-zho | Finetuned LMs | XLM-Twitter | 98.9997 |
Sentiment-zho-Wan | sentiment-2020-wan-zho | Finetuned LMs | TwHIN-BERT | 98.9997 |
Sentiment-zho-Wan | sentiment-2020-wan-zho | Finetuned LMs | XLM-RoBERTa-Large | 98.9997 |
Sexism-fre-Chiril | sexism-2020-chiril-fre | Finetuned LMs | XLM-RoBERTa-Large | 84.0856 |
Subject-ita-Basile | subject-2014-basile-ita | Finetuned LMs | InfoDCL | 80.1589 |
Subject-ita-Basile | subject-2016-basile-ita | Finetuned LMs | XLM-RoBERTa-Large | 75.9493 |
Subjective-ces-Priban | subjective-2022-priban-ces | Finetuned LMs | XLM-RoBERTa-Large | 93.6 |
Subjective-kor-Jang | subjective-2013-jang-kor | Finetuned LMs | XLM-RoBERTa-Large | 38.447 |
Subjective-spa-Barbieri | subjective-2016-barbieri-spa | Finetuned LMs | XLM-RoBERTa-Large | 77.7599 |
Subjective-eng-Pang | subjevtive-2004-pang-eng | Finetuned LMs | XLM-RoBERTa-Large | 96.5333
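The table above keeps only the top submission per task. A minimal sketch of that selection, assuming raw submission records with hypothetical `task`, `model`, and `score` fields (the field names and the sample numbers below are illustrative, not taken from the leaderboard):

```python
# Hypothetical sketch: reduce raw submission records to the best-scoring
# submission per task, which is how a "best submission" table is derived.

def best_per_task(submissions):
    """Return a dict mapping each task name to its highest-scoring record."""
    best = {}
    for sub in submissions:
        task = sub["task"]
        # Keep the record only if it beats the current leader for this task.
        if task not in best or sub["score"] > best[task]["score"]:
            best[task] = sub
    return best

# Illustrative data (scores are made up for the example):
raw = [
    {"task": "Hate-eng-Basile", "model": "XLM-RoBERTa-Large", "score": 61.2},
    {"task": "Hate-eng-Basile", "model": "ChatGPT", "score": 63.7},
    {"task": "Humor-eng-Meaney", "model": "Bernice", "score": 93.4},
]

leaders = best_per_task(raw)
print(leaders["Hate-eng-Basile"]["model"])  # -> ChatGPT
```

Ties (e.g. several models sharing one top score on a task) would each need their own row, as in the table above; this sketch keeps only the first leader it sees.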