Max’s open-source projects are supported by his Patreon. If you found this project helpful, any monetary contributions to the Patreon are appreciated and will be put to good creative use.

Intro

reactionrnn is a Python module (with the R package interface used here) on top of Keras/TensorFlow which can easily predict the proportionate reactions (love, wow, haha, sad, angry) to a given text using a pretrained recurrent neural network:

library(reactionrnn)
Loading required package: keras
react <- reactionrnn()
2017-08-17 14:45:41.723904: W tensorflow/core/platform/cpu_feature_guard.cc:45] The TensorFlow library wasn't compiled to use SSE4.2 instructions, but these are available on your machine and could speed up CPU computations.
2017-08-17 14:45:41.723923: W tensorflow/core/platform/cpu_feature_guard.cc:45] The TensorFlow library wasn't compiled to use AVX instructions, but these are available on your machine and could speed up CPU computations.
2017-08-17 14:45:41.723930: W tensorflow/core/platform/cpu_feature_guard.cc:45] The TensorFlow library wasn't compiled to use AVX2 instructions, but these are available on your machine and could speed up CPU computations.
2017-08-17 14:45:41.723936: W tensorflow/core/platform/cpu_feature_guard.cc:45] The TensorFlow library wasn't compiled to use FMA instructions, but these are available on your machine and could speed up CPU computations.

The reactionrnn() initializer builds a standard Keras model from pretrained weights; the underlying model can be accessed via the $model property:

react$model %>% summary()
Model
_______________________________________________________________________________________________________________
Layer (type)                                     Output Shape                                 Param #          
===============================================================================================================
input (InputLayer)                               (None, 140)                                  0                
_______________________________________________________________________________________________________________
embedding (Embedding)                            (None, 140, 100)                             40100            
_______________________________________________________________________________________________________________
rnn (GRU)                                        (None, 256)                                  274176           
_______________________________________________________________________________________________________________
output (Dense)                                   (None, 5)                                    1285             
===============================================================================================================
Total params: 315,561
Trainable params: 315,561
Non-trainable params: 0
_______________________________________________________________________________________________________________

 
(react$model %>% get_layer('rnn') %>% get_weights())[[3]][1:10]
 [1] 0.24647443 0.92739904 0.20259078 0.02004059 0.35049984 0.21181674 0.18195845 0.59879297 0.20810503
[10] 0.14619187

Predict Reaction to Single Text

The predict function returns a named vector of the five predicted reactions to the given text (if the input is a single string or a one-element character vector/list), sorted from strongest to weakest.

NOTE: Emoji examples are omitted because R Notebook chunk processing breaks if a raw emoji is present in the source. Emoji will still behave as expected in normal R usage.

prediction <- react %>% predict("Happy Mother's Day from the Chicago Cubs!")
prediction
      love        wow       haha        sad      angry 
0.97649449 0.02350551 0.00000000 0.00000000 0.00000000 
react %>% predict("He was only 41.")
  sad  love   wow  haha angry 
    1     0     0     0     0 
react %>% predict("Everyone loves autoplaying videos!")
    angry       wow      love      haha       sad 
0.8667157 0.1332843 0.0000000 0.0000000 0.0000000 
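
A possible workaround for the notebook limitation noted above (untested here): write emoji with Unicode escapes so no raw emoji character appears in the source.

# hypothetical example using a Unicode escape instead of a raw emoji
react %>% predict("This is hilarious \U0001F602")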

As with any named vector, you can access an individual value by name.

prediction['wow']
       wow 
0.02350551 
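
Because the reactions are proportions, the five values sum to 1:

# the five reaction proportions sum to 1
sum(prediction)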

If you just want the strongest predicted label for a given text or texts (useful for classification), you can get it with predict_label.

react %>% predict_label("Happy Mother's Day from the Chicago Cubs!")
[1] "love"
react %>% predict_label(c("He was only 41.", "Everyone loves autoplaying videos!"))
[1] "sad"   "angry"
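
For bulk classification, here is a minimal sketch (using a hypothetical data frame with a text column) that assigns the strongest label to each row:

# hypothetical data frame of texts to classify
tweets <- data.frame(text = c("He was only 41.",
                              "Everyone loves autoplaying videos!"),
                     stringsAsFactors = FALSE)
tweets$reaction <- react %>% predict_label(tweets$text)
tweets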

Predict Reaction to Multiple Texts

If you provide a list/vector of texts to predict(), the return value is a data.frame of dimensions (n, 5), where n is the number of texts.

texts <- c('Never gonna give you up, never gonna let you down',
            'Never gonna run around and desert you',
            'Never gonna make you cry, never gonna say goodbye',
            'Never gonna tell a lie and hurt you')
react %>% predict(texts)
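
As a quick follow-up sketch (not run here), the predictions can be bound back to the source texts so each row stays aligned with its text:

predictions <- react %>% predict(texts)
# attach the original texts as a column alongside the five reaction columns
cbind(text = texts, predictions)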

Encode Text as Vector

You can also encode text as a 256D vector which, unlike vectors produced by word2vec/doc2vec/fasttext, incorporates the information provided by modern grammar, including punctuation and emoji.

encoding <- react %>% encode("DYING.")
encoding[1, 1:5]
encoding %>% dim()
[1]   1 256

You can also encode multiple texts as vectors.

encoding <- react %>% encode(texts)
encoding[, 1:5]
encoding %>% dim()
[1]   4 256
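
As a minimal sketch of comparing these encodings (the cosine_sim helper below is my own, not part of reactionrnn), cosine similarity between two rows gives a rough measure of how similarly the model reads two texts:

# cosine similarity between two encoded texts
cosine_sim <- function(a, b) sum(a * b) / (sqrt(sum(a^2)) * sqrt(sum(b^2)))
cosine_sim(encoding[1, ], encoding[2, ])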

I strongly recommend using PCA both to reduce the high dimensionality of the encodings and to align the encoded texts in the context of each other.

pca <- prcomp(encoding)
(pca %>% predict(encoding))[,1:2]
           PC1        PC2
[1,] -1.142328  3.7282753
[2,] -1.655522 -1.5594023
[3,]  4.582208 -0.3444179
[4,] -1.784358 -1.8244551
explained_variance <- pca$sdev^2 / sum(pca$sdev^2)
explained_variance
[1] 5.285303e-01 3.703582e-01 1.011114e-01 7.964879e-32
sum(explained_variance[1:2])
[1] 0.8988886

These two components explain about 90% of the variation in the texts, which would be hard to achieve without overfitting on only four text documents!
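
A minimal base R plotting sketch (not run here) to visualize the four texts in the first two principal components:

# project the encodings onto PC1/PC2 and label each point by text index
pc <- (pca %>% predict(encoding))[, 1:2]
plot(pc, pch = 19, xlab = "PC1", ylab = "PC2")
text(pc, labels = seq_along(texts), pos = 3)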

LICENSE

MIT License

Copyright (c) 2017 Max Woolf

Permission is hereby granted, free of charge, to any person obtaining a copy of this software and associated documentation files (the “Software”), to deal in the Software without restriction, including without limitation the rights to use, copy, modify, merge, publish, distribute, sublicense, and/or sell copies of the Software, and to permit persons to whom the Software is furnished to do so, subject to the following conditions:

The above copyright notice and this permission notice shall be included in all copies or substantial portions of the Software.

THE SOFTWARE IS PROVIDED “AS IS”, WITHOUT WARRANTY OF ANY KIND, EXPRESS OR IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY, FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM, OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE SOFTWARE.
