Hanyu Guo

Reputation: 717

TensorFlow: implementing seq2seq sentiment analysis

I'm currently playing with the TensorFlow seq2seq model, trying to implement sentiment analysis. My idea is to feed the encoder an IMDB comment, feed the decoder [Pad] or [Go], and use [neg]/[pos] as the target. Most of my code is quite similar to the seq2seq translation example. But the result I get is quite strange: for each batch, the predictions are either all [neg] or all [pos].

"encoder input : I was hooked almost immediately.[pad][pad][pad]"

"decoder input : [pad]"

"target : [pos]"

Since this result is so peculiar, I was wondering if anyone knows what could cause this kind of behavior?
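To make the setup concrete, here is a minimal sketch (plain Python, with hypothetical token names) of the framing described above: the encoder gets the padded review tokens, the decoder gets a single [Go]/[Pad] token, and the target is a single [pos]/[neg] label:

```python
PAD, GO = "[pad]", "[go]"

def make_example(review_tokens, label, encoder_len):
    """Pad the review to a fixed encoder length; decoder input is one token."""
    padding = [PAD] * max(0, encoder_len - len(review_tokens))
    encoder_input = review_tokens[:encoder_len] + padding
    decoder_input = [GO]   # or [PAD], as in the question
    target = [label]       # "[pos]" or "[neg]"
    return encoder_input, decoder_input, target

enc, dec, tgt = make_example(
    ["i", "was", "hooked", "almost", "immediately", "."],
    "[pos]", encoder_len=8)
```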

Upvotes: 3

Views: 1699

Answers (1)

ilblackdragon

Reputation: 1835

I would recommend trying a simpler architecture: an RNN or CNN encoder that feeds into a logistic classifier. These architectures have shown very good results on sentiment analysis (Amazon reviews, Yelp reviews, etc.).

For examples of such models, see here: various encoders (LSTM or convolutional) over words and characters.
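As a framework-free illustration of this suggestion, here is a NumPy sketch of a vanilla RNN encoder whose final hidden state feeds a logistic classifier; all names and dimensions are illustrative, not taken from any specific library:

```python
import numpy as np

rng = np.random.default_rng(0)
vocab, emb_dim, hid_dim = 100, 16, 32

E  = rng.normal(0, 0.1, (vocab, emb_dim))    # embedding table
Wx = rng.normal(0, 0.1, (emb_dim, hid_dim))  # input-to-hidden weights
Wh = rng.normal(0, 0.1, (hid_dim, hid_dim))  # hidden-to-hidden weights
w  = rng.normal(0, 0.1, hid_dim)             # logistic-classifier weights
b  = 0.0

def encode(token_ids):
    """Run a vanilla RNN over the token sequence; return the final state."""
    h = np.zeros(hid_dim)
    for t in token_ids:
        h = np.tanh(E[t] @ Wx + h @ Wh)
    return h

def predict_pos_prob(token_ids):
    """Logistic classifier on top of the encoder's final hidden state."""
    h = encode(token_ids)
    return 1.0 / (1.0 + np.exp(-(h @ w + b)))

p = predict_pos_prob([3, 17, 42, 8])  # probability the review is [pos]
```

The parameters would of course be trained with cross-entropy loss rather than left random; the point is only the shape of the architecture, which avoids the decoder entirely.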

Upvotes: 3
