
Word Embeddings

Word embedding is the collective name for a set of language modeling and feature learning techniques in natural language processing (NLP) where words or phrases from the vocabulary are mapped to vectors of real numbers.

Techniques for learning word embeddings include Word2Vec, GloVe, and other neural network-based approaches that train on an NLP task such as language modeling or document classification.
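
As a minimal sketch of the idea (assuming the gensim library, version 4 or later; the toy corpus and every value below are illustrative, not taken from this page):

# Train a small Word2Vec model on a toy corpus and inspect the learned vectors.
# Assumes gensim >= 4.0; the three sentences are a stand-in for a real corpus.
from gensim.models import Word2Vec

# Each training sentence is a list of tokens; real corpora would be far larger.
sentences = [
    ["the", "king", "rules", "the", "kingdom"],
    ["the", "queen", "rules", "the", "kingdom"],
    ["a", "word", "is", "mapped", "to", "a", "vector"],
]

# sg=1 selects the skip-gram objective; vector_size sets the embedding dimension.
model = Word2Vec(sentences, vector_size=50, window=2, min_count=1, sg=1, seed=42)

vec = model.wv["king"]   # a 50-dimensional real-valued vector for the word "king"
print(vec.shape)         # (50,)

# Nearest neighbors by cosine similarity in the embedding space.
print(model.wv.most_similar("king", topn=2))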

(Image credit: Dynamic Word Embedding for Evolving Semantic Discovery)

Papers

Showing 2641–2650 of 4002 papers

Title | Status | Hype
A Deep Relevance Model for Zero-Shot Document Filtering | Code | 0
Using pseudo-senses for improving the extraction of synonyms from word embeddings | - | 0
Word Embedding and WordNet Based Metaphor Identification and Interpretation | - | 0
Incorporating Latent Meanings of Morphological Compositions to Enhance Word Embeddings | Code | 0
Addressing Noise in Multidialectal Word Embeddings | - | 0
Batch IS NOT Heavy: Learning Word Representations From All Samples | - | 0
Two Methods for Domain Adaptation of Bilingual Tasks: Delightfully Simple and Broadly Applicable | Code | 0
Multi-lingual Entity Discovery and Linking | - | 0
Towards Understanding the Geometry of Knowledge Graph Embeddings | Code | 0
Neural Sparse Topical Coding | - | 0
