
Word Embeddings

Word embedding is the collective name for a set of language modeling and feature learning techniques in natural language processing (NLP) where words or phrases from the vocabulary are mapped to vectors of real numbers.

Techniques for learning word embeddings include Word2Vec, GloVe, and other neural-network-based approaches that train on an NLP task such as language modeling or document classification.
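Alongside the predictive methods above, word vectors can also be obtained from co-occurrence counts. The following is a minimal sketch of a count-based embedding: build a word-word co-occurrence matrix from a toy corpus and factor it with truncated SVD (a GloVe-adjacent, non-neural approach). The corpus, window size, and vector dimensionality here are illustrative choices, not from any particular paper.

```python
import numpy as np

corpus = [
    "the cat sat on the mat",
    "the dog sat on the rug",
    "the cat chased the dog",
]

# Build the vocabulary and a word -> index map.
tokens = [sentence.split() for sentence in corpus]
vocab = sorted({w for sent in tokens for w in sent})
idx = {w: i for i, w in enumerate(vocab)}

# Symmetric co-occurrence counts within a +/-2 word window.
window = 2
C = np.zeros((len(vocab), len(vocab)))
for sent in tokens:
    for i, w in enumerate(sent):
        for j in range(max(0, i - window), min(len(sent), i + window + 1)):
            if i != j:
                C[idx[w], idx[sent[j]]] += 1

# Truncated SVD of the count matrix yields dense real-valued word vectors.
U, S, _ = np.linalg.svd(C, full_matrices=False)
dim = 4  # illustrative embedding dimensionality
vectors = U[:, :dim] * S[:dim]

def cosine(a, b):
    """Cosine similarity between two word vectors."""
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))

sim = cosine(vectors[idx["cat"]], vectors[idx["dog"]])
print(f"cosine(cat, dog) = {sim:.3f}")
```

On a corpus this small the similarities are not meaningful, but the pipeline — counts, factorization, fixed-width real vectors per word — is the same shape as the methods used at scale.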

(Image credit: Dynamic Word Embedding for Evolving Semantic Discovery)

Papers

Showing 3801–3825 of 4002 papers

Entries marked [Code] have an associated code release; all listed papers currently have a hype score of 0.

- Sentence Compression by Deletion with LSTMs
- Distributed Representations for Unsupervised Semantic Role Labeling
- Semi-supervised Dependency Parsing using Bilexical Contextual Features from Auto-Parsed Data
- Semi-Supervised Bootstrapping of Relationship Extractors with Distributional Semantics
- Sarcastic or Not: Word Embeddings to Predict the Literal or Sarcastic Meaning of Words
- Convolutional Sentence Kernel from Word Embeddings for Short Text Categorization
- Translation Invariant Word Embeddings
- Comparing Word Representations for Implicit Discourse Relation Classification
- Reinforcing the Topic of Embeddings with Theta Pure Dependence for Text Classification
- Bilingual Correspondence Recursive Autoencoder for Statistical Machine Translation
- Pre-Computable Multi-Layer Neural Network Language Models
- What's in an Embedding? Analyzing Word Embeddings through Multilingual Evaluation
- Any-language frame-semantic parsing
- A Model of Zero-Shot Learning of Spoken Language Understanding
- Neural Networks for Open Domain Targeted Sentiment [Code]
- Online Learning of Interpretable Word Embeddings [Code]
- Component-Enhanced Chinese Character Embeddings
- Better Summarization Evaluation with Word Embeddings for ROUGE [Code]
- Learning Meta-Embeddings by Using Ensembles of Embedding Sets [Code]
- Syntax-Aware Multi-Sense Word Embeddings for Deep Compositional Models of Meaning
- Cross-Lingual Dependency Parsing with Universal Dependencies and Predicted PoS Labels
- Document Embedding with Paragraph Vectors [Code]
- Reasoning about Linguistic Regularities in Word Embeddings using Matrix Manifolds
- Clustering is Efficient for Approximate Maximum Inner Product Search
- How to Generate a Good Word Embedding? [Code]
