
Word Embeddings

Word embedding is the collective name for a set of language modeling and feature learning techniques in natural language processing (NLP) where words or phrases from the vocabulary are mapped to vectors of real numbers.

Techniques for learning word embeddings include Word2Vec, GloVe, and other neural network-based approaches that train the embeddings on an NLP task such as language modeling or document classification.
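As a concrete illustration, the sketch below trains skip-gram embeddings on a toy corpus using gensim's Word2Vec. This is a minimal example, assuming gensim >= 4.0; the corpus and hyperparameters are illustrative only and not taken from any paper listed on this page.

```python
# Minimal sketch: learning word embeddings with gensim's Word2Vec
# (skip-gram objective). Assumes gensim >= 4.0.
from gensim.models import Word2Vec

# A toy corpus: each document is a list of tokens.
corpus = [
    ["word", "embeddings", "map", "words", "to", "vectors"],
    ["similar", "words", "get", "similar", "vectors"],
    ["word2vec", "trains", "embeddings", "on", "raw", "text"],
]

# sg=1 selects skip-gram; vector_size is the embedding dimension.
model = Word2Vec(corpus, vector_size=50, window=2, min_count=1, sg=1, epochs=100)

# Each vocabulary word is now mapped to a 50-dimensional real-valued vector.
vec = model.wv["embeddings"]
print(vec.shape)  # (50,)

# Cosine similarity in the embedding space reflects distributional similarity.
print(model.wv.most_similar("words", topn=3))
```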

(Image credit: Dynamic Word Embedding for Evolving Semantic Discovery)

Papers

Showing 2951–2960 of 4,002 papers (page 296 of 401)

Title | Status | Hype
SemRe-Rank: Improving Automatic Term Extraction By Incorporating Semantic Relatedness With Personalised PageRank | Code | 1
Evaluation of Croatian Word Embeddings | Code | 0
Learning Word Embeddings from Speech | No code | 0
Fine-tuning Tree-LSTM for phrase-level sentiment classification on a Polish dependency treebank. Submission to PolEval task 2 | Code | 0
Compressing Word Embeddings via Deep Compositional Code Learning | Code | 0
Mining Language Patterns Using Word Embeddings (in Chinese; original title: 應用詞向量於語言樣式探勘之研究) | No code | 0
Towards Lower Bounds on Number of Dimensions for Word Embeddings | No code | 0
Event Ordering with a Generalized Model for Sieve Prediction Ranking | No code | 0
Demographic Word Embeddings for Racism Detection on Twitter | No code | 0
Semantic Features Based on Word Alignments for Estimating Quality of Text Simplification | No code | 0

Leaderboards

No leaderboard results yet.