
Word Embeddings

Word embedding is the collective name for a set of language modeling and feature learning techniques in natural language processing (NLP) where words or phrases from the vocabulary are mapped to vectors of real numbers.

Techniques for learning word embeddings include Word2Vec, GloVe, and other neural approaches that train on an NLP task such as language modeling or document classification (see the sketch below).

(Image credit: Dynamic Word Embedding for Evolving Semantic Discovery)
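As a concrete illustration of "mapping words to vectors", the following minimal sketch trains a small skip-gram Word2Vec model with the gensim library. The toy corpus and hyperparameter values are illustrative assumptions, not taken from any paper listed on this page.

```python
# Minimal word-embedding sketch using gensim's Word2Vec (skip-gram).
# The corpus and hyperparameters below are toy values for illustration.
from gensim.models import Word2Vec

# Each "sentence" is a list of tokens; a real corpus would be far larger.
corpus = [
    ["the", "king", "rules", "the", "kingdom"],
    ["the", "queen", "rules", "the", "kingdom"],
    ["a", "word", "is", "mapped", "to", "a", "vector"],
]

model = Word2Vec(
    sentences=corpus,
    vector_size=50,   # dimensionality of each embedding vector
    window=2,         # context window size
    min_count=1,      # keep every token in this toy corpus
    sg=1,             # 1 = skip-gram, 0 = CBOW
    epochs=50,
    seed=1,
)

vec = model.wv["king"]   # a 50-dimensional vector of real numbers
print(vec.shape)         # (50,)
# Nearest neighbors by cosine similarity in the learned vector space:
print(model.wv.most_similar("king", topn=3))
```

On a realistically sized corpus, the same call yields embeddings whose cosine neighborhoods reflect semantic similarity, which is what makes them useful as features for downstream NLP tasks.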

Papers

Showing 3501–3510 of 4002 papers

Title | Status | Hype
Robust Gram Embeddings | Code | 0
Effective Greedy Inference for Graph-based Non-Projective Dependency Parsing | - | 0
Learning principled bilingual mappings of word embeddings while preserving monolingual invariance | Code | 1
Context-Dependent Sense Embedding | - | 0
Learning Term Embeddings for Taxonomic Relation Identification Using Dynamic Weighting Neural Network | - | 0
A Graph Degeneracy-based Approach to Keyword Extraction | Code | 0
Exploring Semantic Representation in Brain Activity Using Word Embeddings | - | 0
Learning Connective-based Word Representations for Implicit Discourse Relation Identification | - | 0
SimpleScience: Lexical Simplification of Scientific Terminology | - | 0
Learning Sentence Embeddings with Auxiliary Tasks for Cross-Domain Sentiment Classification | - | 0
Page 351 of 401

No leaderboard results yet.