
Word Embeddings

Word embedding is the collective name for a set of language modeling and feature learning techniques in natural language processing (NLP) where words or phrases from the vocabulary are mapped to vectors of real numbers.

Techniques for learning word embeddings include Word2Vec, GloVe, and other neural network-based approaches that train on an NLP task such as language modeling or document classification.
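As a minimal sketch of how such embeddings are learned in practice, the snippet below trains a small Word2Vec model with the gensim library (an assumption of this example; it is not tied to any of the papers listed here). The toy corpus and hyperparameter values are purely illustrative.

```python
# Minimal sketch: learning word embeddings with Word2Vec via gensim
# (assumes gensim is installed; corpus and settings are illustrative only).
from gensim.models import Word2Vec

# Toy corpus: each document is a list of tokens.
corpus = [
    ["word", "embeddings", "map", "words", "to", "vectors"],
    ["word2vec", "and", "glove", "learn", "embeddings", "from", "text"],
]

# Train a skip-gram model (sg=1); vector_size is the embedding dimension.
model = Word2Vec(corpus, vector_size=50, window=2, min_count=1, sg=1, epochs=50)

# Each vocabulary word is now mapped to a real-valued vector.
vector = model.wv["embeddings"]          # 50-dimensional numpy array
similar = model.wv.most_similar("word")  # nearest neighbours by cosine similarity
print(vector.shape, similar[:3])
```

With a realistic corpus, the resulting vectors place semantically related words near each other, which is what the query via most_similar illustrates.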

(Image credit: Dynamic Word Embedding for Evolving Semantic Discovery)

Papers

Showing 881–890 of 4,002 papers

Title | Status | Hype
On a Novel Application of Wasserstein-Procrustes for Unsupervised Cross-Lingual Learning | Code | 0
On Dimensional Linguistic Properties of the Word Embedding Space | Code | 0
Better Summarization Evaluation with Word Embeddings for ROUGE | Code | 0
Contextual String Embeddings for Sequence Labeling | Code | 0
Deep Pivot-Based Modeling for Cross-language Cross-domain Transfer with Minimal Guidance | Code | 0
One Word, Two Sides: Traces of Stance in Contextualized Word Representations | Code | 0
Analytical Methods for Interpretable Ultradense Word Embeddings | Code | 0
Contrastive Learning in Distilled Models | Code | 0
Contrastive Loss is All You Need to Recover Analogies as Parallel Lines | Code | 0
A Bi-Encoder LSTM Model For Learning Unstructured Dialogs | Code | 0
Page 89 of 401

No leaderboard results yet.