1. Problem statement: the sentiment polarity of a sentence is highly dependent on both content and aspect. For example, the sentiment polarity of “Staffs are not that friendly, but the taste covers all.” will be positive if the aspect is food but negative when considering the aspect service.
Two concepts are introduced:
Aspect-level sentiment classification: given a sentence and an aspect that appears in it, the goal is to determine the sentiment polarity of the sentence with respect to that aspect. Aspect-level sentiment analysis is finer-grained than document-level sentiment analysis.
Attention mechanism: attention was first proposed in the computer vision community. The idea is that when people look at an image, they do not take in every pixel at once; instead, they usually focus their attention on the parts of the image that matter for the task at hand. In this paper the authors state: “The attention mechanism can concentrate on different parts of a sentence when different aspects are taken as input.”
2. Main contributions:
We propose attention-based Long Short-Term Memory for aspect-level sentiment classification. The models are able to attend to different parts of a sentence when different aspects are concerned. Results show that the attention mechanism is effective.
Since aspect plays a key role in this task, we propose two ways to take into account aspect information during attention: one way is to concatenate the aspect vector into the sentence hidden representations for computing attention weights, and another way is to additionally append the aspect vector into the input word vectors. (In other words, one approach adds the aspect information at the hidden layer, the other at the input layer.)
3. The paper proposes three models: LSTM with Aspect Embedding (AE-LSTM), Attention-based LSTM (AT-LSTM), and Attention-based LSTM with Aspect Embedding (ATAE-LSTM).
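As a rough sketch of the aspect-embedding idea behind the AE/ATAE variants (the array shapes below are assumptions for illustration, not the paper's exact dimensions): the aspect embedding is appended to every input word vector before the sentence is fed to the LSTM.

```python
import numpy as np

# word_embs: one row per word of the sentence; v_a: the aspect embedding.
word_embs = np.random.randn(5, 300)   # 5 words, 300-dim GloVe-style vectors (assumed size)
v_a = np.random.randn(100)            # 100-dim aspect embedding (assumed size)

# AE-/ATAE-LSTM input: append the aspect embedding to every word vector,
# so the LSTM sees the aspect at each time step.
lstm_inputs = np.hstack([word_embs, np.tile(v_a, (word_embs.shape[0], 1))])
print(lstm_inputs.shape)              # (5, 400)
```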
α is the attention weight vector, r is the weighted representation of the sentence given the aspect, and v_a is the aspect embedding (the vector representation of the aspect).
The final sentence representation is h* = tanh(W_p r + W_x h_N), where h_N is the last hidden state of the LSTM. h* is considered as the feature representation of a sentence given an input aspect.
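A minimal NumPy sketch of this attention step, using random placeholder values for the hidden states H and the projection matrices (in the model itself, H comes from the LSTM and the W matrices and the scoring vector w are learned):

```python
import numpy as np

def softmax(x):
    e = np.exp(x - x.max())
    return e / e.sum()

def aspect_attention(H, v_a, W_h, W_v, w, W_p, W_x):
    """Aspect-conditioned attention over LSTM hidden states.

    H:   (d, N)  hidden states of the N words.
    v_a: (da,)   aspect embedding.
    Returns the attention weights alpha and the sentence
    representation h_star = tanh(W_p r + W_x h_N).
    """
    N = H.shape[1]
    Va = np.tile(v_a[:, None], (1, N))           # repeat v_a for every word
    M = np.tanh(np.vstack([W_h @ H, W_v @ Va]))  # (d + da, N)
    alpha = softmax(w @ M)                       # (N,) attention weights
    r = H @ alpha                                # (d,) weighted sentence repr.
    h_star = np.tanh(W_p @ r + W_x @ H[:, -1])   # combine with last hidden state
    return alpha, h_star

# toy shapes: hidden size d=4, aspect size da=3, sentence length N=5
rng = np.random.default_rng(1)
d, da, N = 4, 3, 5
alpha, h_star = aspect_attention(
    H=rng.normal(size=(d, N)), v_a=rng.normal(size=da),
    W_h=rng.normal(size=(d, d)), W_v=rng.normal(size=(da, da)),
    w=rng.normal(size=d + da),
    W_p=rng.normal(size=(d, d)), W_x=rng.normal(size=(d, d)))
print(alpha.sum())  # attention weights sum to 1
```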
The authors then add a linear layer that converts the sentence vector h* into a vector e whose length equals the number of sentiment classes, followed by a softmax layer that produces the probability of each class.
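In code, this classification head might look like the following sketch; `W_s` and `b_s` are hypothetical names for the linear-layer parameters, and C is the number of polarity classes:

```python
import numpy as np

def classify(h_star, W_s, b_s):
    """Project the sentence representation to class scores and normalize."""
    e = W_s @ h_star + b_s            # length-C score vector
    e = e - e.max()                   # numerical stability
    probs = np.exp(e) / np.exp(e).sum()
    return probs                      # probability of each polarity class

C, d = 3, 4                           # 3 classes (pos/neg/neutral), hidden size 4
rng = np.random.default_rng(2)
probs = classify(rng.normal(size=d), rng.normal(size=(C, d)), np.zeros(C))
print(probs, probs.sum())             # probabilities sum to 1
```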
4. The training objective is the cross-entropy loss, the model is trained with stochastic gradient descent, and the word vectors are initialized with GloVe embeddings pre-trained on Twitter data.
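A toy illustration of the objective and update rule mentioned above, with made-up numbers (the real model backpropagates through the whole network):

```python
import numpy as np

def cross_entropy(p, y):
    """p: predicted class probabilities, y: one-hot gold label."""
    return -np.sum(y * np.log(p + 1e-12))

def sgd_step(param, grad, lr=0.01):
    """One vanilla stochastic-gradient-descent update."""
    return param - lr * grad

p = np.array([0.7, 0.2, 0.1])     # predicted distribution over 3 classes
y = np.array([1.0, 0.0, 0.0])     # gold label: first class
print(cross_entropy(p, y))        # ≈ 0.357 = -log(0.7)

W = np.zeros((2, 2))
W = sgd_step(W, grad=np.ones((2, 2)))   # parameters move against the gradient
```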
Dataset: the experiments use the SemEval 2014 Task 4 dataset, which consists of customer reviews.
Task definitions:
Task 1, Aspect-level Classification: given a set of pre-identified aspects, this task is to determine the sentiment polarity of each aspect.
Task 2, Aspect-Term-level Classification: for a given set of aspect terms within a sentence, this task is to determine whether the polarity of each aspect term is positive, negative, or neutral.
Results:
5. Notes
Data format:
The aspect in Task 1 refers to aspectCategory, while the aspect in Task 2 refers to aspectTerm.
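For orientation, here is an invented example in the general shape of the SemEval 2014 Task 4 XML (not an actual record from the dataset), together with one way to read out the two kinds of aspect annotations:

```python
import xml.etree.ElementTree as ET

# Invented example in the general shape of the SemEval 2014 Task 4 data:
# aspectTerm entries drive Task 2, aspectCategory entries drive Task 1.
sample = """
<sentences>
  <sentence id="1">
    <text>Staffs are not that friendly, but the taste covers all.</text>
    <aspectTerms>
      <aspectTerm term="taste" polarity="positive"/>
      <aspectTerm term="Staffs" polarity="negative"/>
    </aspectTerms>
    <aspectCategories>
      <aspectCategory category="food" polarity="positive"/>
      <aspectCategory category="service" polarity="negative"/>
    </aspectCategories>
  </sentence>
</sentences>
"""

root = ET.fromstring(sample)
for sent in root.iter("sentence"):
    text = sent.find("text").text
    terms = [(t.get("term"), t.get("polarity")) for t in sent.iter("aspectTerm")]
    cats = [(c.get("category"), c.get("polarity")) for c in sent.iter("aspectCategory")]
    print(text, terms, cats)
```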
The authors mainly compare against Tang et al., 2015a, which uses an LSTM to classify the sentiment of a sentence toward a given target; in that work the target corresponds to an aspectTerm.
Attention visualization: the authors use a visualization tool to plot the attention weights α over the words of example sentences, showing which words each aspect attends to.
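A rough way to reproduce that kind of word-level heatmap with made-up attention weights (matplotlib; the numbers below are illustrative only, not weights from the trained model):

```python
import matplotlib.pyplot as plt
import numpy as np

# made-up attention weights for the example sentence under the aspect "service"
words = "Staffs are not that friendly , but the taste covers all .".split()
alpha = np.array([0.18, 0.03, 0.22, 0.05, 0.30, 0.02,
                  0.04, 0.02, 0.06, 0.04, 0.02, 0.02])

fig, ax = plt.subplots(figsize=(8, 1.5))
ax.imshow(alpha[None, :], cmap="Reds", aspect="auto")   # one row: darker = more attention
ax.set_xticks(range(len(words)))
ax.set_xticklabels(words, rotation=45, ha="right")
ax.set_yticks([])
plt.tight_layout()
plt.show()
```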
6. References
Yequan Wang, Minlie Huang, Li Zhao, and Xiaoyan Zhu. Attention-based LSTM for Aspect-level Sentiment Classification. EMNLP 2016.