
Word Embeddings

Word embedding is the collective name for a set of language modeling and feature learning techniques in natural language processing (NLP) where words or phrases from the vocabulary are mapped to vectors of real numbers.
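To make the mapping concrete, here is a minimal Python sketch of what "words mapped to vectors of real numbers" means in practice: a vocabulary indexes rows of a real-valued embedding matrix. The vocabulary, dimension, and random values below are illustrative assumptions, not from any particular model; in a trained system the vectors would be learned, not random.

```python
import numpy as np

# Toy vocabulary mapping each word to a row index (illustrative only).
vocab = {"king": 0, "queen": 1, "apple": 2}
embedding_dim = 4

# Randomly initialized embedding matrix; real embeddings are learned.
rng = np.random.default_rng(0)
embeddings = rng.normal(size=(len(vocab), embedding_dim))

def embed(word: str) -> np.ndarray:
    """Map a word to its real-valued vector by table lookup."""
    return embeddings[vocab[word]]

print(embed("king"))  # a 4-dimensional vector of floats
```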

Techniques for learning word embeddings include Word2Vec, GloVe, and other neural network-based approaches that train on an NLP task such as language modeling or document classification.
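As one example of such a technique, the sketch below trains a small skip-gram Word2Vec model. It assumes the gensim library (version 4 or later) and a tiny hand-written corpus, neither of which is named in the page above, so treat it as an illustration of the general approach rather than the method of any listed paper.

```python
from gensim.models import Word2Vec

# Toy corpus: a list of tokenized sentences (assumed input format).
sentences = [
    ["word", "embeddings", "map", "words", "to", "vectors"],
    ["word2vec", "learns", "embeddings", "from", "context"],
    ["glove", "also", "produces", "word", "vectors"],
]

# Train a small skip-gram (sg=1) Word2Vec model; hyperparameters
# here are arbitrary and chosen only to run on the toy corpus.
model = Word2Vec(sentences, vector_size=50, window=2, min_count=1, sg=1)

vec = model.wv["embeddings"]                    # 50-dimensional real vector
similar = model.wv.most_similar("word", topn=3)  # nearest neighbors by cosine
print(vec.shape, similar)
```

On a corpus this small the neighbors are meaningless; the point is only the workflow: tokenized text in, one real-valued vector per vocabulary word out.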

(Image credit: Dynamic Word Embedding for Evolving Semantic Discovery)

Papers

Showing 2211–2220 of 4002 papers

Title | Status | Hype
Zero-Shot Cross-Lingual Opinion Target Extraction | — | 0
Evaluating the Underlying Gender Bias in Contextualized Word Embeddings | — | 0
Analytical Methods for Interpretable Ultradense Word Embeddings | Code | 0
MoralStrength: Exploiting a Moral Lexicon and Embedding Similarity for Moral Foundations Prediction | Code | 0
Contextual Aware Joint Probability Model Towards Question Answering System | — | 0
Query Expansion for Cross-Language Question Re-Ranking | — | 0
Text2Node: a Cross-Domain System for Mapping Arbitrary Phrases to a Taxonomy | — | 0
Better Word Embeddings by Disentangling Contextual n-Gram Information | Code | 0
Detecting Cybersecurity Events from Noisy Short Text | — | 0
What's in a Name? Reducing Bias in Bios without Access to Protected Attributes | — | 0

Leaderboards

No leaderboard results yet.