Getting Started with Word2Vec

Published: 2019-02-16




1. Sources from Google

Project with Code: https://code.google.com/archive/p/word2vec/

Blog: Learning Meaning Behind Words

Paper:

  1. Tomas Mikolov, Kai Chen, Greg Corrado, and Jeffrey Dean. Efficient Estimation of Word Representations in Vector Space. In Proceedings of Workshop at ICLR, 2013.
  2. Tomas Mikolov, Ilya Sutskever, Kai Chen, Greg Corrado, and Jeffrey Dean. Distributed Representations of Words and Phrases and their Compositionality. In Proceedings of NIPS, 2013.
  3. Tomas Mikolov, Wen-tau Yih, and Geoffrey Zweig. Linguistic Regularities in Continuous Space Word Representations. In Proceedings of NAACL HLT, 2013.
  4. Tomas Mikolov, Quoc V. Le, and Ilya Sutskever. Exploiting Similarities among Languages for Machine Translation. arXiv preprint, 2013.
  5. NIPS Deep Learning Workshop: NN for Text, by Tomas Mikolov et al. https://docs.google.com/file/d/0B7XkCwpI5KDYRWRnd1RzWXQ2TWc/edit

2. Best Explanation

These resources give the clearest explanation of the original models, their optimization methods, and the back-propagation background, and include a Word Embedding Visual Inspector:

paper: word2vec Parameter Learning Explained

Slides: Word Embedding Explained and Visualized

Youtube Video: Word Embedding Explained and Visualized – word2vec and wevi

Demo: wevi: word embedding visual inspector
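The parameter-learning paper above derives the forward pass and gradients for the original skip-gram model with a full softmax output layer. As a rough sketch of that derivation in code (numpy only, toy dimensions chosen here for illustration, not taken from the paper):

```python
import numpy as np

rng = np.random.default_rng(0)
V, N = 5, 3                                  # vocabulary size, embedding dimension
W_in = rng.normal(scale=0.1, size=(V, N))    # input (center-word) vectors
W_out = rng.normal(scale=0.1, size=(V, N))   # output (context-word) vectors

center, context = 1, 3                       # one (center, context) training pair

# Forward pass: hidden layer is just the center word's input vector,
# and the output is a softmax over every word in the vocabulary
h = W_in[center]
scores = W_out @ h
probs = np.exp(scores - scores.max())
probs /= probs.sum()
loss = -np.log(probs[context])

# Backward pass: the prediction error is (softmax output - one-hot target)
e = probs.copy()
e[context] -= 1.0
grad_W_out = np.outer(e, h)                  # gradient for all output vectors
grad_h = W_out.T @ e                         # gradient for the center vector

# One SGD step
lr = 0.1
W_out -= lr * grad_W_out
W_in[center] -= lr * grad_h
```

Recomputing the loss for the same pair after the update should give a smaller value, which is an easy sanity check when reading the paper's derivation alongside code.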

3. Word2Vec Tutorials

Word2Vec Tutorial by Chris McCormick

Chris McCormick http://mccormickml.com/

Note: these tutorials skip over the usual introductory and abstract insights about Word2Vec and get into more of the details.

Word2Vec Tutorial – The Skip-Gram Model

Word2Vec Tutorial Part 2 – Negative Sampling

Alex Minnaar’s Tutorials

Alex Minnaar http://alexminnaar.com/

Word2Vec Tutorial Part I: The Skip-Gram Model

Word2Vec Tutorial Part II: The Continuous Bag-of-Words Model
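Both tutorial series above cover the skip-gram model and negative sampling in depth. As a toy illustration (numpy only; this is my own sketch, not code from the tutorials), a single skip-gram-with-negative-sampling update replaces the full softmax with one sigmoid per (context or negative) word:

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

rng = np.random.default_rng(42)
V, N = 10, 4                                 # toy vocabulary size and dimension
W_in = rng.normal(scale=0.1, size=(V, N))    # center-word vectors
W_out = rng.normal(scale=0.1, size=(V, N))   # context-word vectors

def sgns_step(center, context, negatives, lr=0.05):
    """One skip-gram negative-sampling SGD step; returns the loss."""
    h = W_in[center]
    loss = 0.0
    grad_h = np.zeros_like(h)
    # Label 1 for the true context word, 0 for each sampled negative
    for word, label in [(context, 1.0)] + [(n, 0.0) for n in negatives]:
        p = sigmoid(W_out[word] @ h)
        loss += -np.log(p if label else 1.0 - p)
        g = p - label                        # d(loss)/d(score)
        grad_h += g * W_out[word]
        W_out[word] -= lr * g * h
    W_in[center] -= lr * grad_h
    return loss

before = sgns_step(center=2, context=5, negatives=[1, 7, 9])
after = sgns_step(center=2, context=5, negatives=[1, 7, 9])
```

Repeating the same step should lower the loss, since each update only touches the center vector and the handful of sampled output vectors, which is exactly why negative sampling is so much cheaper than a full softmax.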

4. Learning by Coding

Distributed Representations of Sentences and Documents http://nbviewer.jupyter.org/github/fbkarsdorp/doc2vec/blob/master/doc2vec.ipynb

An Anatomy of Key Tricks in word2vec project with examples http://nbviewer.jupyter.org/github/dolaameng/tutorials/blob/master/word2vec-abc/poc/pyword2vec_anatomy.ipynb

  1. Deep learning with word2vec and gensim, Part One
  2. Word2vec in Python, Part Two: Optimizing
  3. Parallelizing word2vec in Python, Part Three
  4. Gensim word2vec document: models.word2vec – Deep learning with word2vec
  5. Word2vec Tutorial by Radim Řehůřek (Note: Simple but very powerful tutorial for word2vec model training in gensim.)

5. Other Word2Vec Resources

Word2Vec Resources by Chris McCormick

Posted by TextProcessing

References

  1. https://textprocessing.org/getting-started-with-word2vec

Source: https://www.cnblogs.com/fengyubo/p/10387311.html
