
Word Embeddings

Word embedding is the collective name for a set of language modeling and feature learning techniques in natural language processing (NLP) where words or phrases from the vocabulary are mapped to vectors of real numbers.
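
Concretely, an embedding is just a lookup table: each vocabulary word gets an integer index, and its vector is the corresponding row of a real-valued matrix that is tuned during training. A minimal sketch in NumPy follows; the toy vocabulary and dimensionality are illustrative assumptions, not taken from any particular model:

```python
import numpy as np

# Hypothetical toy vocabulary; real models build this from a corpus.
vocab = ["king", "queen", "man", "woman"]
word_to_idx = {w: i for i, w in enumerate(vocab)}

dim = 8  # embedding dimensionality; trained systems often use 50-300
rng = np.random.default_rng(0)
E = rng.normal(scale=0.1, size=(len(vocab), dim))  # one row per word

def embed(word: str) -> np.ndarray:
    """Look up a word's vector: a single row of the embedding matrix."""
    return E[word_to_idx[word]]

print(embed("king").shape)  # (8,)
```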

Techniques for learning word embeddings include Word2Vec, GloVe, and other approaches that train on an NLP task such as language modeling or document classification.
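
To illustrate what training on such a task means, below is a minimal skip-gram-with-negative-sampling loop in plain NumPy, in the spirit of Word2Vec. The corpus, hyperparameters, and helper names are assumptions made for this sketch; it is not a reference implementation of any specific library.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy corpus (illustrative); real training uses very large corpora.
corpus = "the king rules the land the queen rules the land".split()
vocab = sorted(set(corpus))
w2i = {w: i for i, w in enumerate(vocab)}
V, D = len(vocab), 16                       # vocabulary size, embedding dim

W_in = rng.normal(scale=0.1, size=(V, D))   # target-word vectors
W_out = rng.normal(scale=0.1, size=(V, D))  # context-word vectors

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

lr, window, k = 0.05, 2, 3  # learning rate, context window, negatives
for epoch in range(200):
    for pos, word in enumerate(corpus):
        t = w2i[word]
        lo, hi = max(0, pos - window), min(len(corpus), pos + window + 1)
        for ctx_pos in range(lo, hi):
            if ctx_pos == pos:
                continue
            c = w2i[corpus[ctx_pos]]
            # One observed (positive) pair plus k random negatives;
            # for simplicity we do not exclude accidental collisions.
            pairs = [(c, 1.0)] + [(int(rng.integers(V)), 0.0) for _ in range(k)]
            for j, label in pairs:
                score = sigmoid(W_in[t] @ W_out[j])
                grad = score - label          # d(logistic loss)/d(logit)
                g_in = grad * W_out[j]        # save before W_out changes
                W_out[j] -= lr * grad * W_in[t]
                W_in[t] -= lr * g_in

# After training, rows of W_in serve as the word embeddings.
def most_similar(word, topn=3):
    v = W_in[w2i[word]]
    sims = (W_in @ v) / (np.linalg.norm(W_in, axis=1) * np.linalg.norm(v))
    return [vocab[i] for i in np.argsort(-sims) if vocab[i] != word][:topn]

print(most_similar("king"))
```

The key design choice is that words are pushed toward the contexts they actually occur with and away from random ones, so words appearing in similar contexts end up with similar vectors; GloVe reaches a similar result by factorizing global co-occurrence statistics rather than streaming over context windows.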

(Image credit: Dynamic Word Embedding for Evolving Semantic Discovery)

Papers

Showing papers 3431–3440 of 4,002 (page 344 of 401)

Title | Status | Hype
Word Embeddings and Convolutional Neural Network for Arabic Sentiment Classification | Code | 0
Machine Translation Evaluation for Arabic using Morphologically-enriched Embeddings | - | 0
Knowledge-Driven Event Embedding for Stock Prediction | - | 0
Bad Company---Neighborhoods in Neural Embedding Spaces Considered Harmful | Code | 0
A Deep Fusion Model for Domain Adaptation in Phrase-based MT | - | 0
Weighted Neural Bag-of-n-grams Model: New Baselines for Text Classification | Code | 0
CharNER: Character-Level Named Entity Recognition | Code | 0
Semantic Annotation Aggregation with Conditional Crowdsourcing Models and Word Embeddings | - | 0
Borrow a Little from your Rich Cousin: Using Embeddings and Polarities of English Words for Multilingual Sentiment Classification | - | 0
Facing the most difficult case of Semantic Role Labeling: A collaboration of word embeddings and co-training | - | 0

No leaderboard results yet.