
Word Embeddings

Word embedding is the collective name for a set of language modeling and feature learning techniques in natural language processing (NLP) where words or phrases from the vocabulary are mapped to vectors of real numbers.

Techniques for learning word embeddings include Word2Vec, GloVe, and other neural network-based approaches that train on an NLP task such as language modeling or document classification.
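As a minimal illustration of the idea that words can be mapped to real-valued vectors, the sketch below builds embeddings from co-occurrence counts plus truncated SVD (an LSA-style baseline, not Word2Vec or GloVe themselves); the toy corpus, window size, and dimensionality are illustrative assumptions.

```python
# Sketch: learn word vectors from co-occurrence statistics of a toy corpus.
# This is an LSA-style baseline, not an implementation of Word2Vec/GloVe.
import numpy as np

corpus = [
    "the cat sat on the mat",
    "the dog sat on the rug",
    "a cat and a dog played",
]
tokens = [s.split() for s in corpus]
vocab = sorted({w for sent in tokens for w in sent})
idx = {w: i for i, w in enumerate(vocab)}

# Symmetric co-occurrence counts within a +/-2 word window.
window = 2
counts = np.zeros((len(vocab), len(vocab)))
for sent in tokens:
    for i, w in enumerate(sent):
        for j in range(max(0, i - window), min(len(sent), i + window + 1)):
            if j != i:
                counts[idx[w], idx[sent[j]]] += 1

# Truncated SVD maps each word to a dense real-valued vector (dim 4 here).
dim = 4
u, s, _ = np.linalg.svd(counts, full_matrices=False)
embeddings = u[:, :dim] * s[:dim]

def cosine(a, b):
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))

# Words in similar contexts ("cat"/"dog") end up with similar vectors.
sim = cosine(embeddings[idx["cat"]], embeddings[idx["dog"]])
print(f"cosine(cat, dog) = {sim:.3f}")
```

Neural approaches replace the explicit count matrix with vectors learned by gradient descent on a prediction task, but the output has the same shape: one dense vector per vocabulary word, comparable by cosine similarity.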

(Image credit: Dynamic Word Embedding for Evolving Semantic Discovery)

Papers

Showing 3801–3825 of 4002 papers

Title | Status | Hype
Specializing Word Embeddings for Similarity or Relatedness | | 0
Sentence Compression by Deletion with LSTMs | | 0
Pre-Computable Multi-Layer Neural Network Language Models | | 0
Using reading behavior to predict grammatical functions | | 0
Exploiting Debate Portals for Semi-Supervised Argumentation Mining in User-Generated Web Discourse | Code | 0
Semi-supervised Dependency Parsing using Bilexical Contextual Features from Auto-Parsed Data | | 0
Exploring Word Embedding for Drug Name Recognition | | 0
Syntactic Dependencies and Distributed Word Representations for Analogy Detection and Mining | | 0
Towards a Model of Prediction-based Syntactic Category Acquisition: First Steps with Word Embeddings | | 0
Convolutional Sentence Kernel from Word Embeddings for Short Text Categorization | | 0
Sarcastic or Not: Word Embeddings to Predict the Literal or Sarcastic Meaning of Words | | 0
Any-language frame-semantic parsing | | 0
What's in an Embedding? Analyzing Word Embeddings through Multilingual Evaluation | | 0
Comparing Word Representations for Implicit Discourse Relation Classification | | 0
Empty Category Detection using Path Features and Distributed Case Frames | | 0
Distributed Representations for Unsupervised Semantic Role Labeling | | 0
Improving evaluation and optimization of MT systems against MEANT | | 0
Component-Enhanced Chinese Character Embeddings | | 0
Better Summarization Evaluation with Word Embeddings for ROUGE | Code | 0
Learning Meta-Embeddings by Using Ensembles of Embedding Sets | Code | 0
Syntax-Aware Multi-Sense Word Embeddings for Deep Compositional Models of Meaning | | 0
Cross-Lingual Dependency Parsing with Universal Dependencies and Predicted PoS Labels | | 0
Document Embedding with Paragraph Vectors | Code | 0
Reasoning about Linguistic Regularities in Word Embeddings using Matrix Manifolds | | 0
Clustering is Efficient for Approximate Maximum Inner Product Search | | 0
Page 153 of 161

No leaderboard results yet.