AUTHOR=Church Kenneth, Liu Boxiang TITLE=Acronyms and Opportunities for Improving Deep Nets JOURNAL=Frontiers in Artificial Intelligence VOLUME=4 YEAR=2021 URL=https://www.frontiersin.org/journals/artificial-intelligence/articles/10.3389/frai.2021.732381 DOI=10.3389/frai.2021.732381 ISSN=2624-8212 ABSTRACT=
Recently, several studies have reported promising results with BERT-like methods on acronym tasks. In this study, we find that an older rule-based program, Ab3P, not only performs better; error analysis also suggests why. There is a well-known spelling convention in acronyms whereby each letter in the short form (SF) corresponds to a “salient” letter in the long form (LF). The error analysis uses decision trees and logistic regression to show that many pre-trained models (BERT, T5, BioBERT, BART, ERNIE) have an opportunity to take advantage of this spelling convention.
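The spelling convention described above can be illustrated with a minimal sketch. This simplifies the convention to a case-insensitive subsequence match between the SF and the LF; the actual rules in systems like Ab3P are richer (e.g., preferring word-initial "salient" letters), so this is an assumption-laden approximation, not the paper's algorithm.

```python
def sf_matches_lf(sf: str, lf: str) -> bool:
    """Rough check of the acronym spelling convention: each letter of the
    short form (SF) must appear, in order, somewhere in the long form (LF).
    This is a case-insensitive subsequence match -- a deliberate
    simplification of the richer salience rules used by tools like Ab3P."""
    lf = lf.lower()
    pos = 0
    for ch in sf.lower():
        pos = lf.find(ch, pos)
        if pos < 0:
            return False  # SF letter has no remaining match in the LF
        pos += 1  # continue searching after the matched letter
    return True

# The SF letters B, E, R, T each map to letters of the LF, in order:
print(sf_matches_lf("BERT", "Bidirectional Encoder Representations from Transformers"))
```

A subsequence check like this is the weakest form of the convention; tightening it (e.g., requiring word-initial matches for capitalized SF letters) moves it closer to what rule-based extractors actually enforce.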