
Word Embeddings

Word embedding is the collective name for a set of language modeling and feature learning techniques in natural language processing (NLP) where words or phrases from the vocabulary are mapped to vectors of real numbers.
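As a minimal sketch of the idea (with made-up vectors, not a trained model), the mapping can be pictured as a lookup table from words to real-valued vectors, where related words get similar vectors and similarity is measured with the cosine of the angle between them:

```python
import numpy as np

# Toy illustration only: each vocabulary word is mapped to a dense vector
# of real numbers. The values here are invented for demonstration; a real
# embedding model learns them from a corpus.
embeddings = {
    "king":  np.array([0.80, 0.65, 0.10]),
    "queen": np.array([0.78, 0.70, 0.12]),
    "apple": np.array([0.05, 0.20, 0.90]),
}

def cosine_similarity(u, v):
    """Cosine of the angle between two embedding vectors."""
    return float(np.dot(u, v) / (np.linalg.norm(u) * np.linalg.norm(v)))

print(cosine_similarity(embeddings["king"], embeddings["queen"]))  # high
print(cosine_similarity(embeddings["king"], embeddings["apple"]))  # low
```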

Techniques for learning word embeddings include Word2Vec, GloVe, and other neural network-based approaches that train on an NLP task such as language modeling or document classification.
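For instance, a Word2Vec model can be trained with the gensim library (gensim is not part of this page and is used here only as one common implementation; the corpus and hyperparameters below are illustrative):

```python
from gensim.models import Word2Vec

# Tiny tokenized corpus; a real model would be trained on millions of sentences.
sentences = [
    ["the", "king", "rules", "the", "kingdom"],
    ["the", "queen", "rules", "the", "kingdom"],
    ["apples", "grow", "on", "trees"],
]

# sg=1 selects the skip-gram objective; vector_size is the embedding
# dimensionality, window the context size, min_count the frequency cutoff.
model = Word2Vec(sentences, vector_size=50, window=2, min_count=1, sg=1)

vec = model.wv["king"]                # the learned 50-dimensional vector
print(model.wv.most_similar("king"))  # nearest neighbors in embedding space
```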

(Image credit: Dynamic Word Embedding for Evolving Semantic Discovery)

Papers

Showing 3361–3370 of 4002 papers

Title | Hype
Clinical Text Classification with Rule-based Features and Knowledge-guided Convolutional Neural Networks | 0
CLULEX at SemEval-2021 Task 1: A Simple System Goes a Long Way | 0
Clustering Comparable Corpora of Russian and Ukrainian Academic Texts: Word Embeddings and Semantic Fingerprints | 0
Clustering is Efficient for Approximate Maximum Inner Product Search | 0
Clustering of Russian Adjective-Noun Constructions using Word Embeddings | 0
Clustering Prominent People and Organizations in Topic-Specific Text Corpora | 0
Clustering Word Embeddings with Self-Organizing Maps. Application on LaRoSeDa - A Large Romanian Sentiment Data Set | 0
Cluster Labeling by Word Embeddings and WordNet's Hypernymy | 0
CNN- and LSTM-based Claim Classification in Online User Comments | 0
CNN-based Spoken Term Detection and Localization without Dynamic Programming | 0

No leaderboard results yet.