
Word Embeddings

Word embedding is the collective name for a set of language modeling and feature learning techniques in natural language processing (NLP) where words or phrases from the vocabulary are mapped to vectors of real numbers.

Techniques for learning word embeddings include Word2Vec, GloVe, and other neural-network-based approaches that train on an NLP task such as language modeling or document classification.
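
To make the idea concrete, below is a minimal sketch of training skip-gram Word2Vec embeddings. It assumes the gensim library (version 4 or later) is installed; the toy corpus and all names in it are illustrative, not taken from any paper listed on this page.

```python
# A minimal Word2Vec sketch, assuming gensim >= 4.0 (pip install gensim).
# The corpus here is a toy example; real training uses large text collections.
from gensim.models import Word2Vec

corpus = [
    ["word", "embeddings", "map", "words", "to", "real", "valued", "vectors"],
    ["word2vec", "learns", "vectors", "by", "predicting", "context", "words"],
    ["glove", "learns", "vectors", "from", "global", "cooccurrence", "counts"],
]

# sg=1 selects the skip-gram objective; vector_size is the embedding dimension.
model = Word2Vec(corpus, vector_size=50, window=2, min_count=1, sg=1, epochs=100)

vec = model.wv["vectors"]      # one 50-dimensional real-valued vector
print(vec.shape)               # -> (50,)

# Nearest neighbours in the learned vector space.
print(model.wv.most_similar("word2vec", topn=3))
```

Skip-gram (sg=1) predicts context words from the center word; setting sg=0 instead trains the CBOW variant, which predicts the center word from its context.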

(Image credit: Dynamic Word Embedding for Evolving Semantic Discovery)

Papers

Showing 2631–2640 of 4002 papers

Title | Hype
Using Word Embedding for Cross-Language Plagiarism Detection | 0
Using Word Embeddings for Automatic Query Expansion | 0
Using Word Embeddings for Bilingual Unsupervised WSD | 0
Using Word Embeddings for Improving Statistical Machine Translation of Phrasal Verbs | 0
Using Word Embeddings for Italian Crime News Categorization | 0
Using Word Embeddings for Query Translation for Hindi to English Cross Language Information Retrieval | 0
Using Word Embeddings for Unsupervised Acronym Disambiguation | 0
Using Word Embeddings for Visual Data Exploration with Ontodia and Wikidata | 0
Using Word Embeddings in Twitter Election Classification | 0
Page 264 of 401

No leaderboard results yet.