
Word Embeddings

Word embedding is the collective name for a set of language modeling and feature learning techniques in natural language processing (NLP) where words or phrases from the vocabulary are mapped to vectors of real numbers.
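Concretely, an embedding can be viewed as a lookup table from vocabulary items to dense real-valued vectors, where geometric closeness (for example, cosine similarity) stands in for semantic relatedness. The words and vectors below are hypothetical toy values, chosen only to illustrate the lookup and the comparison:

```python
import numpy as np

# Hypothetical toy embedding table: each vocabulary word maps to a dense
# vector of real numbers. The values are invented for illustration only.
embedding = {
    "king":  np.array([0.8, 0.3, 0.1]),
    "queen": np.array([0.7, 0.4, 0.2]),
    "apple": np.array([0.1, 0.9, 0.8]),
}

def cosine_similarity(u, v):
    """Cosine of the angle between two vectors; values near 1.0 mean similar."""
    return float(np.dot(u, v) / (np.linalg.norm(u) * np.linalg.norm(v)))

# Semantically related words should end up with nearby vectors.
print(cosine_similarity(embedding["king"], embedding["queen"]))  # ~0.98, high
print(cosine_similarity(embedding["king"], embedding["apple"]))  # ~0.41, lower
```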

Techniques for learning word embeddings include Word2Vec, GloVe, and other neural network-based approaches that train on an NLP task such as language modeling or document classification.
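As a minimal sketch of how one of these techniques is typically used in practice (not the method of any particular paper listed below), the following trains a Word2Vec model with the Gensim library. The toy corpus is made up, and the Gensim >= 4.0 API is assumed; older versions name the dimensionality argument `size` instead of `vector_size`:

```python
from gensim.models import Word2Vec

# Tiny toy corpus: a list of tokenized sentences. Real training corpora
# are far larger; this only shows the shape of the API.
sentences = [
    ["word", "embeddings", "map", "words", "to", "vectors"],
    ["word2vec", "learns", "embeddings", "from", "raw", "text"],
    ["glove", "learns", "embeddings", "from", "cooccurrence", "counts"],
]

model = Word2Vec(
    sentences,
    vector_size=50,  # dimensionality of the learned vectors
    window=2,        # context window size
    min_count=1,     # keep every word, since the corpus is tiny
    sg=1,            # 1 = skip-gram; 0 = CBOW
    epochs=20,
)

vec = model.wv["embeddings"]          # learned vector for one word
print(vec.shape)                      # (50,)
print(model.wv.most_similar("word"))  # nearest neighbors in vector space
```

Skip-gram (`sg=1`) predicts context words from a target word and tends to work better for rare words, while CBOW predicts the target from its context and is usually faster to train.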

(Image credit: Dynamic Word Embedding for Evolving Semantic Discovery)

Papers

Showing 2871–2880 of 4002 papers

Title | Status | Hype
Learning Eligibility in Cancer Clinical Trials using Deep Neural Networks | Code | 0
Word sense induction using word embeddings and community detection in complex networks | | 0
UnibucKernel: A kernel-based learning method for complex word identification | | 0
Hierarchical Learning of Cross-Language Mappings through Distributed Vector Representations for Code | Code | 0
Enhanced Word Representations for Bridging Anaphora Resolution | | 0
Neural Fine-Grained Entity Type Classification with Hierarchy-Aware Loss | Code | 0
Improving Optimization for Models With Continuous Symmetry Breaking | | 0
The emergent algebraic structure of RNNs and embeddings in NLP | | 0
Query and Output: Generating Words by Querying Distributed Word Representations for Paraphrase Generation | Code | 0
Concatenated Power Mean Word Embeddings as Universal Cross-Lingual Sentence Representations | Code | 0

No leaderboard results yet.