
Word Embeddings

Word embedding is the collective name for a set of language modeling and feature learning techniques in natural language processing (NLP) where words or phrases from the vocabulary are mapped to vectors of real numbers.

Techniques for learning word embeddings include Word2Vec, GloVe, and other neural-network-based approaches that train on an NLP task such as language modeling or document classification.

(Image credit: Dynamic Word Embedding for Evolving Semantic Discovery)
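
As a concrete illustration of the idea, the sketch below trains a small skip-gram Word2Vec model with the gensim library and looks up the learned vectors. This is a minimal example, not a method from any paper listed here; it assumes gensim >= 4.0, and the toy corpus and hyperparameter values are purely illustrative.

    # Minimal Word2Vec sketch (assumes gensim >= 4.0 is installed).
    from gensim.models import Word2Vec

    # Toy corpus: each document is a list of tokens.
    corpus = [
        ["word", "embeddings", "map", "words", "to", "vectors"],
        ["glove", "and", "word2vec", "learn", "word", "vectors"],
        ["vectors", "capture", "semantic", "similarity", "between", "words"],
    ]

    # Train a skip-gram model (sg=1); vector_size is the embedding dimension.
    # All hyperparameters here are illustrative, chosen for a tiny corpus.
    model = Word2Vec(
        corpus, vector_size=50, window=3, min_count=1, sg=1, epochs=50
    )

    # Look up the learned vector for a word and its nearest neighbors.
    vec = model.wv["vectors"]  # a 50-dimensional numpy array
    print(model.wv.most_similar("vectors", topn=3))

In practice one would train on a large corpus (or load pretrained GloVe/Word2Vec vectors) rather than a toy list of sentences, but the interface is the same: words in, real-valued vectors out.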

Papers

Showing 1331–1340 of 4002 papers

Title | Hype
Aligning Open IE Relations and KB Relations using a Siamese Network Based on Word Embedding | 0
End-to-End Entity Linking and Disambiguation leveraging Word and Knowledge Graph Embeddings | 0
Active Discriminative Text Representation Learning | 0
Exploring Word Embeddings for Unsupervised Textual User-Generated Content Normalization | 0
Captioning Images with Novel Objects via Online Vocabulary Expansion | 0
ENGLAWI: From Human- to Machine-Readable Wiktionary | 0
English-Malay Cross-Lingual Embedding Alignment using Bilingual Lexicon Augmentation | 0
English-Malay Word Embeddings Alignment for Cross-lingual Emotion Classification with Hierarchical Attention Network | 0
English Resource Semantics | 0
Estimating word co-occurrence probabilities from pretrained static embeddings using a log-bilinear model | 0
