Model:
avichr/heBERT_NER
HeBERT is a Hebrew pretrained language model. It is based on Google's BERT architecture and uses the BERT-Base configuration.
HeBERT was trained on three datasets: a Hebrew version of OSCAR, a Hebrew dump of Wikipedia, and emotion UGC (user-generated content) data collected for the authors' study.
HeBERT_NER classifies named entities in text, such as persons, organizations, and locations. It was evaluated with the F1-score on a labeled dataset from Ben Mordecai and M. Elhadad (2005).
How to use:

```python
from transformers import pipeline

# Load the HeBERT NER token-classification pipeline
NER = pipeline(
    "token-classification",
    model="avichr/heBERT_NER",
    tokenizer="avichr/heBERT_NER",
)

NER('דויד לומד באוניברסיטה העברית שבירושלים')
```
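The pipeline above returns one prediction per sub-token. As a minimal sketch, whole-entity spans can be obtained with the `aggregation_strategy` parameter of the token-classification pipeline (available in recent transformers releases); the output field names below are those used by that pipeline, not something specific to this model:

```python
from transformers import pipeline

# Group sub-token predictions into whole entity spans instead of
# per-token labels (requires a transformers release that supports
# aggregation_strategy).
ner_grouped = pipeline(
    "token-classification",
    model="avichr/heBERT_NER",
    tokenizer="avichr/heBERT_NER",
    aggregation_strategy="simple",
)

for entity in ner_grouped('דויד לומד באוניברסיטה העברית שבירושלים'):
    # Each result is a dict with keys such as entity_group, word, score, start, end
    print(entity["entity_group"], entity["word"], round(entity["score"], 3))
```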
Other HeBERT models:
- Emotion Recognition Model. An online demo is available on Hugging Face Spaces or as a Colab notebook.
- Sentiment Analysis model.
- Masked-LM model (can be fine-tuned to any downstream task). A usage sketch follows this list.
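As a minimal sketch of the masked-LM use case (assuming the base avichr/heBERT checkpoint and the standard BERT [MASK] token; the example sentence is illustrative):

```python
from transformers import pipeline

# Fill-mask with the base HeBERT checkpoint (avichr/heBERT is assumed here;
# as a BERT-style model it uses the [MASK] token).
fill_mask = pipeline(
    "fill-mask",
    model="avichr/heBERT",
    tokenizer="avichr/heBERT",
)

# Top predictions for the masked word in a Hebrew sentence
print(fill_mask("הקורונה לקחה את [MASK] ולנו לא נשאר דבר."))
```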
Avichay Chriqui and Inbal Yahav, The Coller Semitic Languages AI Lab. Thank you, תודה, شكرا
Chriqui, A., & Yahav, I. (2021). HeBERT & HebEMO: a Hebrew BERT Model and a Tool for Polarity Analysis and Emotion Recognition. arXiv preprint arXiv:2102.01909.
```bibtex
@article{chriqui2021hebert,
  title={HeBERT \& HebEMO: a Hebrew BERT Model and a Tool for Polarity Analysis and Emotion Recognition},
  author={Chriqui, Avihay and Yahav, Inbal},
  journal={arXiv preprint arXiv:2102.01909},
  year={2021}
}
```