Model:
invokerliang/MWP-BERT-zh
Results from the NAACL 2022 paper: MWP-BERT: Numeracy-Augmented Pre-training for Math Word Problem Solving
GitHub link: https://github.com/LZhenwen/MWP-BERT/
Please use the tokenizer of the "hfl/chinese-bert-wwm-ext" model.
@inproceedings{liang2022mwp,
  title={MWP-BERT: Numeracy-Augmented Pre-training for Math Word Problem Solving},
  author={Liang, Zhenwen and Zhang, Jipeng and Wang, Lei and Qin, Wei and Lan, Yunshi and Shao, Jie and Zhang, Xiangliang},
  booktitle={Findings of the Association for Computational Linguistics: NAACL 2022},
  pages={997--1009},
  year={2022}
}