Description
Word embeddings (like GloVe, fastText, and word2vec) are very powerful for capturing general word semantics. But what if your use case is domain-specific? Will your embeddings still work? And if they don’t, how do you retrain them?