
Word Embeddings

Word embedding is the collective name for a set of language modeling and feature learning techniques in natural language processing (NLP) where words or phrases from the vocabulary are mapped to vectors of real numbers.

Techniques for learning word embeddings include Word2Vec, GloVe, and other neural approaches that train the vectors on an auxiliary NLP task such as language modeling or document classification.
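Once trained, embeddings are typically compared with cosine similarity: semantically related words get vectors that point in similar directions. The following sketch uses tiny hand-made 4-dimensional vectors purely for illustration (real models like Word2Vec or GloVe learn vectors of 50–300+ dimensions from large corpora):

```python
import math

# Toy 4-dimensional vectors, invented for illustration only; a trained
# model would learn these values from a corpus.
embeddings = {
    "king":  [0.80, 0.65, 0.10, 0.05],
    "queen": [0.75, 0.70, 0.12, 0.04],
    "apple": [0.05, 0.10, 0.90, 0.70],
}

def cosine_similarity(u, v):
    """Cosine of the angle between two vectors -- the standard way
    to compare word embeddings."""
    dot = sum(a * b for a, b in zip(u, v))
    norm_u = math.sqrt(sum(a * a for a in u))
    norm_v = math.sqrt(sum(b * b for b in v))
    return dot / (norm_u * norm_v)

# Related words end up closer together than unrelated ones.
sim_royal = cosine_similarity(embeddings["king"], embeddings["queen"])
sim_fruit = cosine_similarity(embeddings["king"], embeddings["apple"])
assert sim_royal > sim_fruit
```

The same comparison underlies nearest-neighbor queries ("which words are most similar to X?") in embedding toolkits.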

(Image credit: Dynamic Word Embedding for Evolving Semantic Discovery)

Papers

Showing 2351–2375 of 4002 papers

Title | Status | Hype
ChemBoost: A chemical language based approach for protein-ligand binding affinity prediction | Code | 0
Word-like character n-gram embedding | Code | 0
Interpreting Word-Level Hidden State Behaviour of Character-Level LSTM Language Models | | 0
Importance of Self-Attention for Sentiment Analysis | | 0
Frame- and Entity-Based Knowledge for Common-Sense Argumentative Reasoning | Code | 0
A mostly unlexicalized model for recognizing textual entailment | | 0
Interpretable Word Embedding Contextualization | | 0
Macquarie University at BioASQ 6b: Deep learning and deep reinforcement learning for query-based summarisation | | 0
Stance Detection in Fake News: A Combined Feature Representation | | 0
Robust Word Vectors: Context-Informed Embeddings for Noisy Texts | | 0
Normalization of Transliterated Words in Code-Mixed Data Using Seq2Seq Model & Levenshtein Distance | | 0
Multilingual Embeddings Jointly Induced from Contexts and Concepts: Simple, Strong and Scalable | | 0
GlobalTrait: Personality Alignment of Multilingual Word Embeddings | | 0
Truly unsupervised acoustic word embeddings using weak top-down constraints in encoder-decoder models | Code | 0
Learning Unsupervised Word Mapping by Maximizing Mean Discrepancy | | 0
SIEVE: Helping Developers Sift Wheat from Chaff via Cross-Platform Analysis | | 0
Measuring Issue Ownership using Word Embeddings | | 0
Aligning Very Small Parallel Corpora Using Cross-Lingual Word Embeddings and a Monogamy Objective | | 0
Attentive Neural Network for Named Entity Recognition in Vietnamese | Code | 0
Word Mover's Embedding: From Word2Vec to Document Embedding | Code | 0
Subword Encoding in Lattice LSTM for Chinese Word Segmentation | Code | 0
Magnitude: A Fast, Efficient Universal Vector Embedding Utility Package | Code | 0
Learning Emotion from 100 Observations: Unexpected Robustness of Deep Learning under Strong Data Limitations | | 0
Local Homology of Word Embeddings | | 0
Word Sense Induction using Knowledge Embeddings | | 0
Page 95 of 161

No leaderboard results yet.