Hello sir,
I have read the whole article and found it very useful. I want to know how each word in the vocabulary is mapped to a word embedding. If an embedding for a word is not present in the pretrained model, how is such a word represented? Please explain this concept to me. I do not think that an embedding for every word exists in the word2vec pretrained model.
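For example, is a fallback like the one below the usual way to handle such out-of-vocabulary words? This is only a minimal sketch of what I mean, assuming gensim's KeyedVectors API; the file path is just a placeholder, not a real file I have.

import numpy as np
from gensim.models import KeyedVectors

# Load pretrained word2vec vectors (the path here is a hypothetical example).
kv = KeyedVectors.load_word2vec_format("GoogleNews-vectors-negative300.bin", binary=True)

def embed(word):
    """Return the pretrained vector if the word is in the vocabulary,
    otherwise fall back to a zero vector of the same dimensionality."""
    if word in kv:                      # vocabulary lookup
        return kv[word]                 # pretrained embedding
    return np.zeros(kv.vector_size)    # out-of-vocabulary fallback (my assumption)

print(embed("king").shape)      # in-vocabulary word -> (300,)
print(embed("sanpreetxyz"))     # OOV word -> zero vector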
With regards,
Sanpreet Singh