Word Embeddings

Word embedding is the collective name for a set of language modeling and feature learning techniques in natural language processing (NLP) where words or phrases from the vocabulary are mapped to vectors of real numbers.
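
To make the definition concrete, here is a minimal NumPy sketch of "words mapped to vectors of real numbers": each word indexes a row of an embedding matrix, and geometric similarity between rows stands in for semantic similarity. The three-word vocabulary and the vector values are made-up illustrations, not learned embeddings.

```python
import numpy as np

# Toy vocabulary and embedding matrix: each row is one word's vector.
# Values are invented for illustration; real embeddings are learned.
vocab = {"king": 0, "queen": 1, "apple": 2}
embeddings = np.array([
    [0.8, 0.1, 0.6],   # "king"
    [0.7, 0.2, 0.7],   # "queen"
    [0.1, 0.9, 0.0],   # "apple"
])

def cosine(u, v):
    """Cosine similarity between two vectors."""
    return float(u @ v / (np.linalg.norm(u) * np.linalg.norm(v)))

king, queen, apple = (embeddings[vocab[w]] for w in ("king", "queen", "apple"))
print(cosine(king, queen))  # high: related words sit close together
print(cosine(king, apple))  # lower: unrelated words sit farther apart
```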

Techniques for learning word embeddings include Word2Vec, GloVe, and other neural network-based approaches that train on an NLP task such as language modeling or document classification.
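
As a sketch of how such embeddings are typically trained, the following example uses gensim's Word2Vec implementation. The toy corpus and hyperparameter values are placeholder assumptions; a useful model would be trained on a corpus of millions of tokens.

```python
from gensim.models import Word2Vec

# Toy corpus: a real model needs a much larger corpus to learn useful vectors.
sentences = [
    ["the", "king", "rules", "the", "kingdom"],
    ["the", "queen", "rules", "the", "kingdom"],
    ["people", "eat", "an", "apple"],
]

# Skip-gram Word2Vec; all hyperparameter values here are illustrative.
model = Word2Vec(
    sentences=sentences,
    vector_size=50,   # dimensionality of the word vectors
    window=3,         # context window size
    min_count=1,      # keep every word (only sensible for a toy corpus)
    sg=1,             # 1 = skip-gram, 0 = CBOW
    epochs=50,
)

vec = model.wv["king"]                        # learned 50-d vector for "king"
print(model.wv.most_similar("king", topn=3))  # nearest neighbors in vector space
```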

(Image credit: Dynamic Word Embedding for Evolving Semantic Discovery)

Papers

Showing 3811–3820 of 4002 papers

Title | Status | Hype
Gaussian LDA for Topic Models with Word Embeddings | Code | 0
GCDT: A Global Context Enhanced Deep Transition Architecture for Sequence Labeling | Code | 0
Character Composition Model with Convolutional Neural Networks for Dependency Parsing on Morphologically Rich Languages | Code | 0
The Shape of Word Embeddings: Quantifying Non-Isometry With Topological Data Analysis | Code | 0
Putting Evaluation in Context: Contextual Embeddings Improve Machine Translation Evaluation | Code | 0
Putting words in context: LSTM language models and lexical ambiguity | Code | 0
Centroid-based Text Summarization through Compositionality of Word Embeddings | Code | 0
Gender Bias in Neural Natural Language Processing | Code | 0
A Robust Bias Mitigation Procedure Based on the Stereotype Content Model | Code | 0
Sobolev Transport: A Scalable Metric for Probability Measures with Graph Metrics | Code | 0
Page 382 of 401

No leaderboard results yet.