Title

Sentence BERT語意分析模型簡介

Parallel Title

A Brief Introduction to Sentence BERT Semantic Analysis Model

Authors

陳嘉浩 (Chen, Chia-Hao); 官長治 (Kuan, Chang-Chih)

Keywords

Service for Passing and Exchanging Electronic Documents (SPEED); service-oriented smart government; artificial intelligence (AI)

Journal Title

檔案半年刊 (Archives Semiannual)

Volume/Issue (Publication Date)

Vol. 21, No. 2 (2022/12/01)

Pages

88-105

Language

English

Chinese Abstract

Artificial intelligence (AI) technology has advanced considerably in recent years thanks to substantial gains in computing power. On August 3, 2020, the Executive Yuan approved the promotion strategy of the "Service-oriented Smart Government 2.0 Promotion Plan," one of whose strategies is to deepen smart services built on emerging technologies. The "2021-2024 Archives Service Manifesto" of the National Archives Administration, National Development Council likewise centers on applying AI technology to semantic network models, keyword analysis, and post-classification retrieval, constructing a smart semantic network mechanism and planning related application services. Customer service for document and records management information systems naturally needs to follow this trend and offer intelligent customer services. This article introduces Sentence BERT, a semantic analysis model proposed by Reimers and Gurevych (2019), illustrates its application through a simple text-based customer service implementation, and offers suggestions that agencies may consult when developing applications based on this model.

English Abstract

Artificial intelligence (AI) technology has made great progress in recent years due to the substantial increase in computing power. The Executive Yuan approved the promotion strategy of the "Service-oriented Smart Government 2.0 Promotion Plan" on August 3, 2020. One of its strategies is to strengthen intelligent services developed with emerging technologies. The "2021-2024 Archives Service Manifesto" proposed by the National Archives Administration, National Development Council also calls for applying AI technology to semantic network models, keyword analysis, and post-classification retrieval, so as to construct a smart semantic network mechanism and plan related application services. The customer services of electronic document systems also need to follow this trend and provide intelligent customer services. This article introduces Sentence BERT, a semantic analysis model proposed by Reimers and Gurevych (2019). A simple text-based customer service implementation is presented to illustrate the application of the model, and suggestions are offered for agencies developing related applications.
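The customer-service case described in the abstract reduces to matching a user's question against a set of prepared FAQ questions by sentence-embedding similarity. Below is a minimal, hypothetical sketch of that retrieval step using cosine similarity. The toy vectors, FAQ strings, and function names are illustrative stand-ins, not the authors' implementation; in practice the embeddings would come from a pretrained Sentence BERT model (e.g., via the sentence-transformers package listed on SBERT.net), not from hand-written vectors.

```python
# Sketch of SBERT-style FAQ matching with toy data (hypothetical).
# In a real system, the vectors below would be produced by a pretrained
# Sentence BERT encoder; fixed toy vectors stand in here so that the
# retrieval logic is self-contained and runnable.
import numpy as np

def cosine_similarity(a: np.ndarray, b: np.ndarray) -> float:
    """Cosine similarity between two embedding vectors."""
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))

def best_match(query_vec, faq_vecs, faq_answers):
    """Return the canned answer whose FAQ embedding is closest to the query."""
    scores = [cosine_similarity(query_vec, v) for v in faq_vecs]
    top = int(np.argmax(scores))
    return faq_answers[top], scores[top]

# Toy 3-dimensional "embeddings" for two FAQ entries and one user query.
faq_vecs = [np.array([1.0, 0.1, 0.0]),   # e.g., "How do I reset my password?"
            np.array([0.0, 0.2, 1.0])]   # e.g., "How do I exchange documents?"
faq_answers = ["Use the password-reset page.",
               "Use the SPEED document-exchange function."]
query = np.array([0.9, 0.2, 0.1])        # embedding of the user's question

answer, score = best_match(query, faq_vecs, faq_answers)
```

In a deployed system, a similarity threshold would typically decide whether the top-scoring answer is returned or the query is escalated to a human agent.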

Subject Classification: Humanities > Library and Information Science
References
  1. SBERT.net (n.d.). Pretrained models. Retrieved December 15, 2021, from https://www.sbert.net/docs/pretrained_models.html
  2. Bahdanau, D., Cho, K., & Bengio, Y. (2014). Neural machine translation by jointly learning to align and translate. ICLR 2015.
  3. Chollet, F. (2021). Deep learning with Python. New York, U.S.A.: Manning.
  4. Devlin, J., Chang, M. W., Lee, K., & Toutanova, K. (2018). BERT: Pre-training of deep bidirectional transformers for language understanding. Unpublished manuscript.
  5. Hochreiter, S., & Schmidhuber, J. (1997). Long short-term memory. Neural Computation, 9(8), 1735-1780.
  6. Mikolov, T., Chen, K., Corrado, G., & Dean, J. (2013). Efficient estimation of word representations in vector space. Unpublished manuscript.
  7. Pennington, J., Socher, R., & Manning, C. D. (2014). GloVe: Global vectors for word representation. Proceedings of the 2014 Conference on Empirical Methods in Natural Language Processing (EMNLP).
  8. Reimers, N., & Gurevych, I. (2019). Sentence-BERT: Sentence embeddings using Siamese BERT-networks. EMNLP 2019.
  9. Schuster, M., & Paliwal, K. K. (1997). Bidirectional recurrent neural networks. IEEE Transactions on Signal Processing, 45(11), 2673-2681.
  10. Shaw, P., Uszkoreit, J., & Vaswani, A. (2018). Self-attention with relative position representations. NAACL 2018.
  11. Vaswani, A., Shazeer, N., Parmar, N., Uszkoreit, J., Jones, L., Gomez, A. N., Kaiser, Ł., & Polosukhin, I. (2017). Attention is all you need. 31st Conference on Neural Information Processing Systems (NIPS 2017).
  12. 王怡萱 (2020). Tainan: National Cheng Kung University.
  13. 王若樸 (2021). With fake news about the pandemic everywhere, how does LINE use AI to speed up fact-checking? Retrieved December 15, 2021, from https://www.ithome.com.tw/news/145753