TY - UNPB
T1 - Introducing BEREL
T2 - BERT Embeddings for Rabbinic-Encoded Language
AU - Shmidman, Avi
AU - Guedalia, Joshua
AU - Shmidman, Shaltiel
AU - Shmidman, Cheyn Shmuel
AU - Handel, Eli
AU - Koppel, Moshe
PY - 2022/8/3
Y1 - 2022/8/3
AB - We present a new pre-trained language model (PLM) for Rabbinic Hebrew, termed Berel (BERT Embeddings for Rabbinic-Encoded Language). While other PLMs exist for processing Hebrew texts (e.g., HeBERT, AlephBert), they are all trained on Modern Hebrew texts, which diverge substantially from Rabbinic Hebrew in their lexicographical, morphological, syntactic and orthographic norms. We demonstrate the superiority of Berel on Rabbinic texts via a challenge set of Hebrew homographs. We release the new model and homograph challenge set for unrestricted use.
KW - cs.CL
UR - https://arxiv.org/abs/2208.01875
U2 - 10.48550/arXiv.2208.01875
DO - 10.48550/arXiv.2208.01875
M3 - Preprint
BT - Introducing BEREL
ER -