Word Embeddings

Word embedding is the collective name for a set of language modeling and feature learning techniques in natural language processing (NLP) where words or phrases from the vocabulary are mapped to vectors of real numbers.
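
As a minimal sketch of this mapping, the snippet below assigns each vocabulary word a dense real-valued vector via a lookup table. The vectors here are random placeholders purely for illustration; in practice they would be learned from data.

```python
import numpy as np

# Toy vocabulary and a word -> row-index mapping.
vocab = ["king", "queen", "man", "woman"]
word_to_index = {word: i for i, word in enumerate(vocab)}

# Embedding matrix: one row per word. Real systems typically use
# 50-300+ dimensions; the values here are random, not learned.
embedding_dim = 8
rng = np.random.default_rng(seed=0)
embeddings = rng.normal(size=(len(vocab), embedding_dim))

def embed(word: str) -> np.ndarray:
    """Look up the vector for a word in the embedding matrix."""
    return embeddings[word_to_index[word]]

print(embed("king"))  # an 8-dimensional vector of real numbers
```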

Techniques for learning word embeddings include Word2Vec, GloVe, and other neural-network-based approaches that train on an NLP task such as language modeling or document classification.
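
To make this concrete, here is a sketch of training skip-gram Word2Vec embeddings with the gensim library (assuming the gensim 4.x API); the corpus and hyperparameters are toy values chosen only for illustration.

```python
from gensim.models import Word2Vec

# Toy corpus: a list of tokenized sentences.
sentences = [
    ["the", "king", "rules", "the", "kingdom"],
    ["the", "queen", "rules", "the", "kingdom"],
    ["the", "man", "walks", "in", "the", "city"],
    ["the", "woman", "walks", "in", "the", "city"],
]

# Train skip-gram Word2Vec (sg=1); gensim 4.x constructor arguments.
model = Word2Vec(
    sentences,
    vector_size=50,   # dimensionality of the embedding vectors
    window=2,         # context window size
    min_count=1,      # keep every word in this tiny corpus
    sg=1,             # 1 = skip-gram, 0 = CBOW
    epochs=50,
    seed=0,
)

vector = model.wv["king"]             # 50-dim learned vector for "king"
print(model.wv.most_similar("king"))  # nearest neighbours by cosine similarity
```

After training, semantically related words (here, words sharing contexts such as "king" and "queen") end up with nearby vectors, which is what downstream tasks exploit.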

(Image credit: Dynamic Word Embedding for Evolving Semantic Discovery)

Papers

Showing 2351–2360 of 4002 papers

Title | Status | Hype
ChemBoost: A chemical language based approach for protein-ligand binding affinity prediction | Code | 0
Word-like character n-gram embedding | Code | 0
Interpreting Word-Level Hidden State Behaviour of Character-Level LSTM Language Models | - | 0
Importance of Self-Attention for Sentiment Analysis | - | 0
Frame- and Entity-Based Knowledge for Common-Sense Argumentative Reasoning | Code | 0
A mostly unlexicalized model for recognizing textual entailment | - | 0
Interpretable Word Embedding Contextualization | - | 0
Macquarie University at BioASQ 6b: Deep learning and deep reinforcement learning for query-based summarisation | - | 0
Stance Detection in Fake News: A Combined Feature Representation | - | 0
Robust Word Vectors: Context-Informed Embeddings for Noisy Texts | - | 0
Page 236 of 401

No leaderboard results yet.