An Intelligent Annotation Method for General Text Information in the Power Industry Based on Label Studio and LLMs
Abstract
Keywords
Full Text: PDF