
Word Embeddings

Word embedding is the collective name for a set of language modeling and feature learning techniques in natural language processing (NLP) where words or phrases from the vocabulary are mapped to vectors of real numbers.

Techniques for learning word embeddings include Word2Vec, GloVe, and other neural network-based approaches that train on an NLP task such as language modeling or document classification.

(Image credit: Dynamic Word Embedding for Evolving Semantic Discovery)
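
As a concrete illustration of the word-to-vector mapping described above, the sketch below trains a small skip-gram Word2Vec model on a toy corpus. It assumes the gensim library (version 4 or later) is installed; the corpus, dimensionality, and hyperparameters are illustrative choices, not part of this page.

    # Minimal Word2Vec sketch, assuming gensim >= 4.0 is installed.
    from gensim.models import Word2Vec

    # Tiny illustrative corpus: a list of tokenized sentences.
    corpus = [
        ["word", "embeddings", "map", "words", "to", "vectors"],
        ["word2vec", "learns", "embeddings", "from", "local", "context"],
        ["glove", "learns", "embeddings", "from", "cooccurrence", "counts"],
    ]

    # Train a skip-gram model (sg=1); each vocabulary word is
    # mapped to a 50-dimensional vector of real numbers.
    model = Word2Vec(corpus, vector_size=50, window=2,
                     min_count=1, sg=1, epochs=100)

    vec = model.wv["embeddings"]           # learned vector for "embeddings"
    print(vec.shape)                       # (50,)
    print(model.wv.most_similar("word"))   # nearest neighbours in vector space

Once trained, the model's `wv` attribute acts as the lookup table from words to vectors, and similarity queries reduce to distance comparisons in that vector space.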

Papers

Showing 2581–2590 of 4002 papers

Title | Status | Hype
Mumpitz at PARSEME Shared Task 2018: A Bidirectional LSTM for the Identification of Verbal Multiword Expressions | - | 0
Textual Aggression Detection through Deep Learning | - | 0
TRAC-1 Shared Task on Aggression Identification: IIT(ISM)@COLING'18 | - | 0
Aggressive Language Identification Using Word Embeddings and Sentiment Features | Code | 0
Word-Embedding based Content Features for Automated Oral Proficiency Scoring | - | 0
Sub-label dependencies for Neural Morphological Tagging -- The Joint Submission of University of Colorado and University of Helsinki for VarDial 2018 | - | 0
Gender Bias in Neural Natural Language Processing | Code | 0
Clustering Prominent People and Organizations in Topic-Specific Text Corpora | - | 0
Resource-Size matters: Improving Neural Named Entity Recognition with Optimized Large Corpora | Code | 0
Differentiable Perturb-and-Parse: Semi-Supervised Parsing with a Structured Variational Autoencoder | - | 0
Page 259 of 401

No leaderboard results yet.