
Word Embeddings

Word embedding is the collective name for a set of language modeling and feature learning techniques in natural language processing (NLP) where words or phrases from the vocabulary are mapped to vectors of real numbers.

Techniques for learning word embeddings include Word2Vec, GloVe, and other neural-network-based approaches that train on an NLP task such as language modeling or document classification.
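As a minimal sketch of what such training looks like in practice, the snippet below fits Word2Vec embeddings with gensim; the toy corpus and hyperparameter values are illustrative, not prescribed by any of the listed papers.

```python
# Train word embeddings with gensim's Word2Vec (skip-gram variant).
# The tiny corpus and hyperparameters here are illustrative only.
from gensim.models import Word2Vec

# Each document is a list of tokens; real use would tokenize a large corpus.
corpus = [
    ["king", "rules", "the", "kingdom"],
    ["queen", "rules", "the", "kingdom"],
    ["the", "cat", "sat", "on", "the", "mat"],
]

model = Word2Vec(
    sentences=corpus,
    vector_size=50,  # dimensionality of the embedding vectors
    window=2,        # context window size
    min_count=1,     # keep all tokens in this tiny corpus
    sg=1,            # 1 = skip-gram; 0 = CBOW
)

vec = model.wv["king"]                # a 50-dimensional vector of real numbers
print(model.wv.most_similar("king"))  # nearest neighbors by cosine similarity
```

The skip-gram objective predicts context words from a center word, so tokens appearing in similar contexts end up with nearby vectors; swapping `sg=0` selects the CBOW objective instead.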

(Image credit: Dynamic Word Embedding for Evolving Semantic Discovery)

Papers

Showing 741–750 of 4,002 papers

Title | Status | Hype
Dict2vec : Learning Word Embeddings using Lexical Dictionaries | Code | 0
Clustering Word Embeddings with Self-Organizing Maps. Application on LaRoSeDa -- A Large Romanian Sentiment Data Set | Code | 0
A Robust Bias Mitigation Procedure Based on the Stereotype Content Model | Code | 0
A Method for Studying Semantic Construal in Grammatical Constructions with Interpretable Contextual Embedding Spaces | Code | 0
CMCE at SemEval-2020 Task 1: Clustering on Manifolds of Contextualized Embeddings to Detect Historical Meaning Shifts | Code | 0
A Robust Hybrid Approach for Textual Document Classification | Code | 0
Don't Settle for Average, Go for the Max: Fuzzy Sets and Max-Pooled Word Vectors | Code | 0
End-to-end Recurrent Neural Network Models for Vietnamese Named Entity Recognition: Word-level vs. Character-level | Code | 0
A Robust Self-Learning Method for Fully Unsupervised Cross-Lingual Mappings of Word Embeddings: Making the Method Robustly Reproducible as Well | Code | 0
Deep Learning for Hate Speech Detection in Tweets | Code | 0

No leaderboard results yet.