
Word Embeddings

Word embedding is the collective name for a set of language modeling and feature learning techniques in natural language processing (NLP) where words or phrases from the vocabulary are mapped to vectors of real numbers.

Techniques for learning word embeddings include Word2Vec, GloVe, and other neural network-based approaches that are trained on an NLP task such as language modeling or document classification.
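As a minimal sketch of the idea, the following trains Word2Vec embeddings on a toy corpus (assuming the gensim library, version 4.x; the corpus contents and hyperparameters are illustrative choices, not taken from any paper listed below):

from gensim.models import Word2Vec

# Toy corpus: each document is a pre-tokenized list of words.
corpus = [
    ["the", "king", "rules", "the", "kingdom"],
    ["the", "queen", "rules", "the", "kingdom"],
    ["embeddings", "map", "words", "to", "vectors"],
]

# Train skip-gram embeddings (sg=1); min_count=1 keeps every word
# of this tiny vocabulary. vector_size is the embedding dimension.
model = Word2Vec(corpus, vector_size=50, window=2, min_count=1, sg=1)

# Each vocabulary word is now mapped to a 50-dimensional real vector.
print(model.wv["king"].shape)

# Words appearing in similar contexts ("king"/"queen") end up with
# nearby vectors under cosine similarity.
print(model.wv.similarity("king", "queen"))

Once trained, such vectors can serve as input features for downstream NLP tasks like the classification and extraction models in the paper list below.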

(Image credit: Dynamic Word Embedding for Evolving Semantic Discovery)

Papers

Showing 2951–2960 of 4002 papers

Regionalized models for Spanish language variations based on Twitter
A Latent Concept Topic Model for Robust Topic Inference Using Word Embeddings
A Lexicalized Tree Kernel for Open Information Extraction
Aligning Open IE Relations and KB Relations using a Siamese Network Based on Word Embedding
Aligning Opinions: Cross-Lingual Opinion Mining with Dependencies
Aligning Very Small Parallel Corpora Using Cross-Lingual Word Embeddings and a Monogamy Objective
Aligning Visual Prototypes with BERT Embeddings for Few-Shot Learning
Alignment-free Cross-lingual Semantic Role Labeling
A Linear Dynamical System Model for Text
A Linguistically Informed Convolutional Neural Network

No leaderboard results yet.