Sanpreet Singh
1 min read · Jul 25, 2019

Hello sir,
I have read the whole article and found it very useful. I want to know how each word in the vocabulary is mapped to a word embedding. If an embedding for a word is not present in the pretrained model, how is such a word represented? Please explain this concept to me. I do not think that an embedding for every word is present in the word2vec pretrained model.

With Regards
Sanpreet Singh
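
To make the question concrete: a pretrained word2vec model is essentially a lookup table from vocabulary words to dense vectors, and any word absent from that table is out-of-vocabulary (OOV). A common fallback is to substitute a zero (or random, or special UNK) vector for OOV words. The sketch below uses a toy hand-made table rather than a real pretrained model; the words, values, and the zero-vector fallback are illustrative assumptions, not the article author's method.

```python
import numpy as np

# Toy "pretrained" embedding table: each vocabulary word maps to a dense
# vector. In a real word2vec model these vectors are learned (e.g. 300-d);
# the words and values here are made up for illustration.
embedding_dim = 4
pretrained = {
    "cat": np.array([0.1, 0.2, 0.3, 0.4]),
    "dog": np.array([0.5, 0.6, 0.7, 0.8]),
}

def embed(word, table, dim):
    """Return the word's vector if it is in the vocabulary; otherwise
    return a zero vector, one common fallback for OOV words."""
    return table.get(word, np.zeros(dim))

print(embed("cat", pretrained, embedding_dim))  # known word: its stored vector
print(embed("xyz", pretrained, embedding_dim))  # OOV word: zero vector
```

Zero vectors are only the simplest choice; other approaches include training an UNK embedding, averaging subword vectors, or using a model such as fastText that builds vectors from character n-grams so OOV words still get a representation.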
