
Word Embeddings

Word embedding is the collective name for a set of language modeling and feature learning techniques in natural language processing (NLP) where words or phrases from the vocabulary are mapped to vectors of real numbers.
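As a minimal illustration of that mapping, the sketch below pairs a hand-made toy embedding table with cosine similarity; the words and vector values are invented for illustration, not learned embeddings:

```python
import math

# Toy embedding table: each word maps to a vector of real numbers.
# The values here are invented for illustration, not trained.
embeddings = {
    "king":  [0.90, 0.80, 0.10],
    "queen": [0.85, 0.82, 0.15],
    "apple": [0.10, 0.20, 0.95],
}

def cosine_similarity(u, v):
    """Cosine of the angle between two vectors; values near 1.0 mean 'similar'."""
    dot = sum(a * b for a, b in zip(u, v))
    norm_u = math.sqrt(sum(a * a for a in u))
    norm_v = math.sqrt(sum(b * b for b in v))
    return dot / (norm_u * norm_v)

# Related words are placed closer together in the vector space.
sim_royal = cosine_similarity(embeddings["king"], embeddings["queen"])
sim_fruit = cosine_similarity(embeddings["king"], embeddings["apple"])
print(sim_royal > sim_fruit)
```

In a trained embedding model the table is learned from data rather than written by hand, but the lookup-then-compare pattern is the same.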

Techniques for learning word embeddings include Word2Vec, GloVe, and other neural-network-based approaches that train on an auxiliary NLP task such as language modeling or document classification.
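To make the training idea concrete, here is a deliberately simplified, pure-Python sketch of skip-gram training with negative sampling in the style of Word2Vec. The corpus, dimensionality, and hyperparameters are invented for illustration; a real implementation adds frequent-word subsampling, a unigram negative-sampling distribution, and far more data.

```python
import math
import random

random.seed(42)

corpus = "the cat sat on the mat the dog sat on the rug".split()
vocab = sorted(set(corpus))
dim = 10

# Two vector tables, as in Word2Vec: one for center words, one for context words.
center_vecs = {w: [random.uniform(-0.5, 0.5) for _ in range(dim)] for w in vocab}
context_vecs = {w: [random.uniform(-0.5, 0.5) for _ in range(dim)] for w in vocab}

def sigmoid(x):
    x = max(-30.0, min(30.0, x))  # clamp to avoid overflow in exp
    return 1.0 / (1.0 + math.exp(-x))

def dot(u, v):
    return sum(a * b for a, b in zip(u, v))

def train(epochs=200, window=2, negatives=3, lr=0.05):
    for _ in range(epochs):
        for i, center in enumerate(corpus):
            lo, hi = max(0, i - window), min(len(corpus), i + window + 1)
            for j in range(lo, hi):
                if j == i:
                    continue
                # One positive (center, context) pair plus random negative words.
                samples = [(corpus[j], 1.0)]
                samples += [(random.choice(vocab), 0.0) for _ in range(negatives)]
                c = center_vecs[center]
                for word, label in samples:
                    x = context_vecs[word]
                    grad = sigmoid(dot(c, x)) - label  # logistic-loss gradient
                    for k in range(dim):
                        ck = c[k]
                        c[k] -= lr * grad * x[k]
                        x[k] -= lr * grad * ck

train()

def similarity(a, b):
    u, v = center_vecs[a], center_vecs[b]
    return dot(u, v) / math.sqrt(dot(u, u) * dot(v, v))

# Words used in similar contexts ("cat"/"dog") should drift closer together,
# though results on a corpus this tiny are noisy.
print(similarity("cat", "dog"))
```

Libraries such as gensim package the same idea behind a single training call; this sketch only shows the mechanics.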

(Image credit: Dynamic Word Embedding for Evolving Semantic Discovery)

Papers

Showing 1261–1270 of 4002 papers

Title | Status | Hype
Methods for Numeracy-Preserving Word Embeddings | | 0
Pre-tokenization of Multi-word Expressions in Cross-lingual Word Embeddings | | 0
Span-based discontinuous constituency parsing: a family of exact chart-based algorithms with time complexities from O(n^6) down to O(n^3) | | 0
Unsupervised Cross-Lingual Part-of-Speech Tagging for Truly Low-Resource Scenarios | | 0
WLV-RIT at HASOC-Dravidian-CodeMix-FIRE2020: Offensive Language Identification in Code-switched YouTube Comments | | 0
Evaluating Bias In Dutch Word Embeddings | Code | 0
Multimodal Metric Learning for Tag-based Music Retrieval | Code | 1
"Thy algorithm shalt not bear false witness": An Evaluation of Multiclass Debiasing Methods on Word Embeddings | Code | 0
Emotion Understanding in Videos Through Body, Context, and Visual-Semantic Embedding Loss | Code | 1
A Cross-lingual Natural Language Processing Framework for Infodemic Management | | 0

No leaderboard results yet.