
Word Embeddings

Word embedding is the collective name for a set of language modeling and feature learning techniques in natural language processing (NLP) where words or phrases from the vocabulary are mapped to vectors of real numbers.

Techniques for learning word embeddings include Word2Vec, GloVe, and other neural network-based approaches that train on an NLP task such as language modeling or document classification.
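To make the idea concrete, here is a minimal, self-contained sketch of skip-gram-style embedding training with NumPy. It is a toy illustration of the general technique, not the reference Word2Vec implementation; the corpus, dimensions, and learning rate are all hypothetical choices.

```python
# Toy skip-gram embedding sketch: map each vocabulary word to a dense
# real-valued vector, trained so that words co-occurring within a small
# context window end up with similar vectors. Hypothetical example only.
import numpy as np

corpus = "the cat sat on the mat the dog sat on the rug".split()
vocab = sorted(set(corpus))
word2id = {w: i for i, w in enumerate(vocab)}
V, D, window, lr = len(vocab), 8, 2, 0.05

rng = np.random.default_rng(0)
W_in = rng.normal(scale=0.1, size=(V, D))   # input embeddings (the ones we keep)
W_out = rng.normal(scale=0.1, size=(V, D))  # output (context) embeddings

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

for epoch in range(200):
    for i, w in enumerate(corpus):
        center = word2id[w]
        lo, hi = max(0, i - window), min(len(corpus), i + window + 1)
        for j in range(lo, hi):
            if j == i:
                continue
            ctx = word2id[corpus[j]]
            # Positive pair: pull center and context vectors together.
            score = sigmoid(W_in[center] @ W_out[ctx])
            grad = score - 1.0
            W_in[center] -= lr * grad * W_out[ctx]
            W_out[ctx] -= lr * grad * W_in[center]
            # One random negative sample: push an unrelated word away.
            neg = int(rng.integers(V))
            score_n = sigmoid(W_in[center] @ W_out[neg])
            W_in[center] -= lr * score_n * W_out[neg]
            W_out[neg] -= lr * score_n * W_in[center]

embeddings = W_in  # V x D matrix: one real-valued vector per vocabulary word
print(embeddings.shape)
```

Production systems (e.g. Gensim's Word2Vec or the GloVe toolkit) use the same mapping of words to real vectors, but with negative-sampling distributions, subsampling, and vectorized updates at much larger scale.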

(Image credit: Dynamic Word Embedding for Evolving Semantic Discovery)

Papers

Showing 2026–2050 of 4002 papers

Quantifying Context Overlap for Training Word Embeddings
Quantifying the vanishing gradient and long distance dependency problem in recursive neural networks and recursive LSTMs
Quantum Algorithms for Compositional Text Processing
Quantum-inspired Complex Word Embedding
Query Expansion for Cross-Language Question Re-Ranking
Query Expansion with Locally-Trained Word Embeddings
Querying Word Embeddings for Similarity and Relatedness
Query Obfuscation by Semantic Decomposition
Query Translation for Cross-Language Information Retrieval using Multilingual Word Clusters
Question Answering over Freebase with Multi-Column Convolutional Neural Networks
Questioning Arbitrariness in Language: a Data-Driven Study of Conventional Iconicity
Question Type Classification Methods Comparison
Raccoons at SemEval-2022 Task 11: Leveraging Concatenated Word Embeddings for Named Entity Recognition
Racial Bias Trends in the Text of US Legal Opinions
Random Decision Syntax Trees at SemEval-2018 Task 3: LSTMs and Sentiment Scores for Irony Detection
Random Positive-Only Projections: PPMI-Enabled Incremental Semantic Space Construction
Random Walks and Neural Network Language Models on Knowledge Bases
Ranking Convolutional Recurrent Neural Networks for Purchase Stage Identification on Imbalanced Twitter Data
Ranking Kernels for Structures and Embeddings: A Hybrid Preference and Classification Model
RankMat: Matrix Factorization with Calibrated Distributed Embedding and Fairness Enhancement
Rare Tokens Degenerate All Tokens: Improving Neural Text Generation via Adaptive Gradient Gating for Rare Token Embeddings
RAW-C: Relatedness of Ambiguous Words in Context (A New Lexical Resource for English)
Reading Between the Lines: Overcoming Data Sparsity for Accurate Classification of Lexical Relationships
Realised Volatility Forecasting: Machine Learning via Financial Word Embedding
Real Multi-Sense or Pseudo Multi-Sense: An Approach to Improve Word Representation
Page 82 of 161

No leaderboard results yet.