Why Are Word Embeddings Important for NLP?
You may be familiar with Word2vec (Mikolov et al., 2013). Word2vec lets you add and subtract word vectors as though they capture the meaning of the words. For example, …
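The kind of vector arithmetic the teaser refers to can be sketched with toy vectors. This is a minimal illustration, not real Word2vec: the four 3-dimensional vectors below are made up by hand (real embeddings are learned from a corpus and have hundreds of dimensions), and the word list is hypothetical.

```python
import numpy as np

# Hypothetical toy embeddings; real Word2vec vectors are learned, not hand-set.
vectors = {
    "king":  np.array([0.9, 0.8, 0.1]),
    "man":   np.array([0.5, 0.1, 0.1]),
    "woman": np.array([0.5, 0.1, 0.9]),
    "queen": np.array([0.9, 0.8, 0.9]),
    "apple": np.array([0.1, 0.9, 0.2]),
}

def cosine(a, b):
    # Cosine similarity: 1.0 means the vectors point the same way.
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))

# "king" - "man" + "woman" should land nearest "queen".
target = vectors["king"] - vectors["man"] + vectors["woman"]
best = max(
    (w for w in vectors if w not in {"king", "man", "woman"}),
    key=lambda w: cosine(target, vectors[w]),
)
print(best)  # queen
```

With these toy values the analogy works exactly; with real learned vectors the result is only approximate, which is why nearest-neighbor search over cosine similarity is used.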
Doing Aho Things
The word-embedding code is on GitHub. Introduction: Word embedding is a technique that expresses a word as a low-dimensional real-valued vector (about 200 dimensions …
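A minimal sketch of what "a low-dimensional real-valued vector per word" means. The vocabulary and the random values below are assumptions for illustration only; in a trained model the 200 numbers per word are learned from text.

```python
import numpy as np

vocab = ["word", "embedding", "vector", "nlp"]
dim = 200  # roughly the dimensionality the post mentions

rng = np.random.default_rng(0)
# Hypothetical embedding table: one dense 200-dimensional vector per word.
# Real systems learn these values; random ones just show the shape.
embeddings = {w: rng.standard_normal(dim) for w in vocab}

vec = embeddings["embedding"]
print(vec.shape)  # (200,)
```

Compare this with a one-hot encoding, whose length equals the vocabulary size and which carries no notion of similarity between words.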
Introduction: On November 29, TensorFlow 0.12 was released. One of its new features is visualization of embedded representations, which makes it possible to analyze high-dimensional data interactively. The following is …
Introduction: Do you have business cards? In Japanese business customs (I'm Japanese), business cards (called "Meishi") are essential. But when there are a lot of them, they are annoying to manage. …
Introduction: When you are working, you have sometimes browsed information that is not relevant to your work, haven't you? I feel awkward when my boss is standing behind me. Of course, I …