
Word Embeddings

Word embedding is the collective name for a set of language modeling and feature learning techniques in natural language processing (NLP) where words or phrases from the vocabulary are mapped to vectors of real numbers.

Techniques for learning word embeddings include Word2Vec, GloVe, and other neural-network-based approaches that train on an NLP task such as language modeling or document classification. A minimal example is sketched below.
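As an illustration (not drawn from any paper listed on this page), the following sketch trains a small skip-gram Word2Vec model with the gensim library. The toy corpus, hyperparameters, and query word are all assumptions chosen for demonstration.

```python
# A minimal sketch of learning word embeddings with Word2Vec via gensim.
# The corpus below is a hypothetical toy example; real models train on
# millions of sentences.
from gensim.models import Word2Vec

# Toy corpus: each sentence is a list of tokens.
corpus = [
    ["word", "embeddings", "map", "words", "to", "vectors"],
    ["similar", "words", "get", "similar", "vectors"],
    ["word2vec", "learns", "embeddings", "from", "context"],
]

# sg=1 selects the skip-gram objective; vector_size is the embedding
# dimension; min_count=1 keeps every token of this tiny corpus.
model = Word2Vec(corpus, vector_size=50, window=2, min_count=1, sg=1, epochs=50)

vec = model.wv["words"]      # a 50-dimensional vector of real numbers
print(vec.shape)             # (50,)

# Nearest neighbors by cosine similarity (noisy on a corpus this small).
print(model.wv.most_similar("words", topn=3))
```

Setting sg=0 instead trains the CBOW variant; GloVe, by contrast, is fit from corpus-wide co-occurrence counts rather than local context windows.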

(Image credit: Dynamic Word Embedding for Evolving Semantic Discovery)

Papers

Showing 3011-3020 of 4002 papers

Title | Status | Hype
Neural Networks Leverage Corpus-wide Information for Part-of-speech Tagging | | 0
Neural Question Answering at BioASQ 5B | | 0
Neural Scoring Function for MST Parser | | 0
Neural sequence labeling for Vietnamese POS Tagging and NER | | 0
Neural Sparse Topical Coding | | 0
Neural-Symbolic Relational Reasoning on Graph Models: Effective Link Inference and Computation from Knowledge Bases | | 0
Neural Text Classification by Jointly Learning to Cluster and Align | | 0
Neural Text Simplification in Low-Resource Conditions Using Weak Supervision | | 0
Neural word embeddings with multiplicative feature interactions for tensor-based compositions | | 0
NEUROSENT-PDI at SemEval-2018 Task 1: Leveraging a Multi-Domain Sentiment Model for Inferring Polarity in Micro-blog Text | | 0

No leaderboard results yet.