
Word Embeddings

Word embedding is the collective name for a set of language modeling and feature learning techniques in natural language processing (NLP) where words or phrases from the vocabulary are mapped to vectors of real numbers.

Techniques for learning word embeddings include Word2Vec, GloVe, and other neural network-based approaches that train on an NLP task such as language modeling or document classification.
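The core idea above, that words appearing in similar contexts get similar vectors, can be illustrated with a minimal count-based sketch (in the spirit of GloVe's co-occurrence statistics, not an implementation of it): build a co-occurrence matrix over a toy corpus, then factorize it with SVD to obtain dense vectors. The corpus, window size, and dimensionality here are illustrative assumptions.

```python
import numpy as np

# Toy corpus; real embeddings are trained on corpora of billions of tokens.
corpus = [
    "the cat sat on the mat",
    "the dog sat on the rug",
    "the cat chased the dog",
]
tokens = [s.split() for s in corpus]
vocab = sorted({w for sent in tokens for w in sent})
idx = {w: i for i, w in enumerate(vocab)}

# Symmetric co-occurrence counts within a +/-2-word window.
window = 2
C = np.zeros((len(vocab), len(vocab)))
for sent in tokens:
    for i in range(len(sent)):
        for j in range(max(0, i - window), min(len(sent), i + window + 1)):
            if i != j:
                C[idx[sent[i]], idx[sent[j]]] += 1

# Truncated SVD of the co-occurrence matrix gives dense,
# low-dimensional word vectors (one row per vocabulary word).
U, S, _ = np.linalg.svd(C)
dim = 3
vectors = U[:, :dim] * S[:dim]

def cosine(a, b):
    """Cosine similarity between two word vectors."""
    return float(a @ b / (np.linalg.norm(a) * np.linalg.norm(b)))

# "cat" and "dog" occur in similar contexts, so their vectors
# can be compared with cosine similarity.
sim = cosine(vectors[idx["cat"]], vectors[idx["dog"]])
```

Neural methods such as Word2Vec learn the vectors directly by prediction rather than by factorizing explicit counts, but the resulting geometry serves the same purpose.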

(Image credit: Dynamic Word Embedding for Evolving Semantic Discovery)

Papers

Showing 2001-2010 of 4002 papers

Title | Code | Hype
Using Word Embeddings to Examine Gender Bias in Dutch Newspapers, 1950-1990 | Code | 0
Exploring sentence informativeness | — | 0
Understanding Neural Machine Translation by Simplification: The Case of Encoder-free Models | — | 0
Analysis of Word Embeddings Using Fuzzy Clustering | — | 0
Differentiable Disentanglement Filter: an Application Agnostic Core Concept Discovery Probe | — | 0
Multimodal deep networks for text and image-based document classification | Code | 0
The Dynamic Embedded Topic Model | Code | 0
Topic Modeling in Embedding Spaces | Code | 1
An Intrinsic Nearest Neighbor Analysis of Neural Machine Translation Architectures | — | 0
Improving Chemical Named Entity Recognition in Patents with Contextualized Word Embeddings | Code | 0

No leaderboard results yet.