
Word Embeddings

Word embedding is the collective name for a set of language modeling and feature learning techniques in natural language processing (NLP) where words or phrases from the vocabulary are mapped to vectors of real numbers.
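To make the mapping concrete, the sketch below represents a few words as tiny hand-made 3-dimensional vectors and compares them with cosine similarity. The vocabulary and vector values are invented for illustration only; learned embeddings typically have hundreds of dimensions.

```python
# Minimal sketch of the idea: each word is a dense real-valued vector,
# and semantic relatedness is measured geometrically (cosine similarity).
# The vectors below are toy values, not learned embeddings.
import numpy as np

embeddings = {
    "king":  np.array([0.80, 0.65, 0.10]),
    "queen": np.array([0.78, 0.70, 0.12]),
    "apple": np.array([0.05, 0.10, 0.90]),
}

def cosine(u: np.ndarray, v: np.ndarray) -> float:
    """Cosine similarity: near 1.0 for similar directions, near 0.0 for unrelated."""
    return float(u @ v / (np.linalg.norm(u) * np.linalg.norm(v)))

print(cosine(embeddings["king"], embeddings["queen"]))  # high: related words
print(cosine(embeddings["king"], embeddings["apple"]))  # low: unrelated words
```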

Techniques for learning word embeddings include Word2Vec, GloVe, and neural network-based approaches that train on an NLP task such as language modeling or document classification.
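As a concrete example, the following sketch trains a skip-gram Word2Vec model on a tiny invented corpus using the gensim library (one assumed implementation choice among many; the corpus and parameter values are illustrative, and real training uses millions of sentences).

```python
# Minimal Word2Vec training sketch, assuming gensim >= 4.0.
from gensim.models import Word2Vec

# Toy corpus: a list of pre-tokenized sentences (invented for illustration).
corpus = [
    ["word", "embeddings", "map", "words", "to", "vectors"],
    ["word2vec", "learns", "embeddings", "from", "context", "windows"],
    ["glove", "learns", "embeddings", "from", "cooccurrence", "counts"],
]

model = Word2Vec(
    sentences=corpus,
    vector_size=50,   # dimensionality of the learned vectors
    window=2,         # context window size
    min_count=1,      # keep every word in this tiny corpus
    sg=1,             # 1 = skip-gram; 0 = CBOW
)

vec = model.wv["embeddings"]   # learned vector for one word
print(vec.shape)               # (50,)
print(model.wv.most_similar("embeddings", topn=3))
```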

(Image credit: Dynamic Word Embedding for Evolving Semantic Discovery)

Papers

Showing 1151–1160 of 4002 papers

Title | Status | Hype
DFKI-MLT System Description for the WMT18 Automatic Post-editing Task | | 0
Development of Word Embeddings for Uzbek Language | | 0
Discourse Relation Sense Classification Using Cross-argument Semantic Similarity Based on Word Embeddings | | 0
An exploration of the encoding of grammatical gender in word embeddings | | 0
Bilexical Embeddings for Quality Estimation | | 0
Discovering Bilingual Lexicons in Polyglot Word Embeddings | | 0
Development of a Japanese Personality Dictionary based on Psychological Methods | | 0
Bilingual Autoencoders with Global Descriptors for Modeling Parallel Sentences | | 0
Discovering linguistic (ir)regularities in word embeddings through max-margin separating hyperplanes | | 0
Developing Conversational Data and Detection of Conversational Humor in Telugu | | 0
Page 116 of 401

No leaderboard results yet.