
Word Embeddings

Word embedding is the collective name for a set of language modeling and feature learning techniques in natural language processing (NLP) where words or phrases from the vocabulary are mapped to vectors of real numbers.
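As a minimal sketch of this mapping (the toy vocabulary, the 8-dimensional vectors, and the random initialization below are illustrative assumptions; real embeddings are learned from data), an embedding is a lookup from a word's index into a matrix of real-valued vectors, and word similarity can then be measured geometrically:

```python
import numpy as np

# Hypothetical toy vocabulary and a randomly initialized embedding
# matrix: one 8-dimensional real-valued vector per word. In practice
# these vectors are learned from text, not drawn at random.
vocab = {"king": 0, "queen": 1, "apple": 2, "banana": 3}
rng = np.random.default_rng(seed=0)
embeddings = rng.normal(size=(len(vocab), 8))

def embed(word: str) -> np.ndarray:
    """Map a word to its vector via a table lookup."""
    return embeddings[vocab[word]]

def cosine_similarity(u: np.ndarray, v: np.ndarray) -> float:
    """A standard similarity measure between two embedding vectors."""
    return float(u @ v / (np.linalg.norm(u) * np.linalg.norm(v)))

# With learned embeddings, semantically related words score higher.
print(cosine_similarity(embed("king"), embed("queen")))
```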

Techniques for learning word embeddings include Word2Vec, GloVe, and other neural network-based approaches that train on an NLP task such as language modeling or document classification.
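As one concrete example, a Word2Vec skip-gram model can be trained with the gensim library; the tiny corpus below is an illustrative assumption, and meaningful embeddings require far more text:

```python
from gensim.models import Word2Vec

# Toy corpus: each document is a tokenized list of words.
# (Illustrative only; real training needs a large corpus.)
corpus = [
    ["word", "embeddings", "map", "words", "to", "vectors"],
    ["word2vec", "learns", "embeddings", "from", "context"],
    ["glove", "learns", "embeddings", "from", "cooccurrence", "counts"],
]

# sg=1 selects the skip-gram objective (sg=0 would be CBOW);
# vector_size is the embedding dimensionality, window the number of
# context words considered on each side of the target word.
model = Word2Vec(corpus, vector_size=50, window=2, min_count=1, sg=1, epochs=100)

vector = model.wv["embeddings"]  # the learned 50-dimensional vector
print(model.wv.most_similar("embeddings", topn=3))
```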

(Image credit: Dynamic Word Embedding for Evolving Semantic Discovery)

Papers

Showing 3071–3080 of 4002 papers

| Title | Status | Hype |
| --- | --- | --- |
| On Sentence Representations for Propaganda Detection: From Handcrafted Features to Word Embeddings | | 0 |
| On the contribution of word embeddings to temporal relation classification | | 0 |
| On the Convergent Properties of Word Embedding Methods | | 0 |
| On the Correlation of Word Embedding Evaluation Metrics | | 0 |
| On the Cross-lingual Transferability of Contextualized Sense Embeddings | | 0 |
| On the Curious Case of ℓ2 norm of Sense Embeddings | | 0 |
| On the effectiveness of feature set augmentation using clusters of word embeddings | | 0 |
| Convolutional Neural Network with Word Embeddings for Chinese Word Segmentation | Code | 0 |
| Bridging Vision and Language Spaces with Assignment Prediction | Code | 0 |
| Co-occurrences using Fasttext embeddings for word similarity tasks in Urdu | Code | 0 |
