
Word Embeddings

Word embedding is the collective name for a set of language modeling and feature learning techniques in natural language processing (NLP) where words or phrases from the vocabulary are mapped to vectors of real numbers.

Techniques for learning word embeddings include Word2Vec, GloVe, and other approaches that train on an NLP task such as language modeling or document classification.
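As a minimal sketch of the idea of mapping words to real-valued vectors, the following (a toy corpus and a count-based SVD factorization, used here as a stand-in for a trained Word2Vec or GloVe model) builds small embeddings in which words that occur in similar contexts end up close together:

```python
import numpy as np

# Toy corpus; real embeddings are trained on millions of tokens.
corpus = [
    "the cat sat on the mat",
    "the dog sat on the rug",
    "the cat chased the dog",
    "a king rules the kingdom",
    "a queen rules the kingdom",
]

# Build the vocabulary and a word-to-index mapping.
tokens = [s.split() for s in corpus]
vocab = sorted({w for sent in tokens for w in sent})
idx = {w: i for i, w in enumerate(vocab)}

# Symmetric co-occurrence counts within a +/-2 word window.
window = 2
C = np.zeros((len(vocab), len(vocab)))
for sent in tokens:
    for i, w in enumerate(sent):
        for j in range(max(0, i - window), min(len(sent), i + window + 1)):
            if i != j:
                C[idx[w], idx[sent[j]]] += 1

# Factor the log-scaled count matrix with truncated SVD to obtain
# dense vectors -- the same intuition GloVe and Word2Vec exploit.
U, S, _ = np.linalg.svd(np.log1p(C), full_matrices=False)
dim = 4
emb = U[:, :dim] * S[:dim]

def similarity(a, b):
    """Cosine similarity between the embedding vectors of words a and b."""
    va, vb = emb[idx[a]], emb[idx[b]]
    return float(va @ vb / (np.linalg.norm(va) * np.linalg.norm(vb)))
```

Here "king" and "queen" appear in identical contexts, so their vectors are nearly identical and their cosine similarity is close to 1.0; that contextual closeness, not any surface feature of the words, is what the vectors encode.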

(Image credit: Dynamic Word Embedding for Evolving Semantic Discovery)

Papers

Showing 25 of 4002 papers

All papers below have a hype score of 0; [Code] marks papers with code available.

- Natural Language Inference with Definition Embedding Considering Context On the Fly
- Multilingual Seq2seq Training with Similarity Loss for Cross-Lingual Document Classification
- Attention-based Semantic Priming for Slot-filling
- Joint learning of frequency and word embeddings for multilingual readability assessment
- A Sequence Learning Method for Domain-Specific Entity Linking
- WordNet Embeddings [Code]
- A Helping Hand: Transfer Learning for Deep Sentiment Analysis
- Searching for the X-Factor: Exploring Corpus Subjectivity for Word Embeddings
- Orthographic Features for Bilingual Lexicon Induction
- Named Entity Recognition With Parallel Recurrent Neural Networks [Code]
- Unsupervised Learning of Distributional Relation Vectors
- Leveraging distributed representations and lexico-syntactic fixedness for token-level prediction of the idiomaticity of English verb-noun combinations
- A Multi-task Approach to Learning Multilingual Representations
- SuperNMT: Neural Machine Translation with Semantic Supersenses and Syntactic Supertags
- Illustrative Language Understanding: Large-Scale Visual Grounding with Image Search
- A Deep Relevance Model for Zero-Shot Document Filtering [Code]
- Using pseudo-senses for improving the extraction of synonyms from word embeddings
- Word Embedding and WordNet Based Metaphor Identification and Interpretation
- Incorporating Latent Meanings of Morphological Compositions to Enhance Word Embeddings [Code]
- Addressing Noise in Multidialectal Word Embeddings
- Batch IS NOT Heavy: Learning Word Representations From All Samples
- Two Methods for Domain Adaptation of Bilingual Tasks: Delightfully Simple and Broadly Applicable [Code]
- Multi-lingual Entity Discovery and Linking
- Towards Understanding the Geometry of Knowledge Graph Embeddings [Code]
- Neural Sparse Topical Coding
Page 106 of 161

No leaderboard results yet.