Word Embeddings

Word embedding is the collective name for a set of language modeling and feature learning techniques in natural language processing (NLP) where words or phrases from the vocabulary are mapped to vectors of real numbers.

Techniques for learning word embeddings include Word2Vec, GloVe, and other neural or count-based approaches trained on an NLP task such as language modeling or document classification.
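To make the mapping from words to real-valued vectors concrete, the following is a minimal sketch of training skip-gram embeddings with gensim's Word2Vec. The toy corpus and hyperparameters are illustrative assumptions, not part of this page; gensim >= 4.0 is assumed for the API.

```python
# Minimal sketch: training Word2Vec (skip-gram) embeddings with gensim.
# Assumes gensim >= 4.0; the tiny corpus is illustrative only, since real
# embeddings need millions of tokens.
from gensim.models import Word2Vec

# A corpus is a list of tokenized sentences.
corpus = [
    ["word", "embeddings", "map", "words", "to", "vectors"],
    ["similar", "words", "receive", "similar", "vectors"],
    ["skip-gram", "predicts", "context", "words", "from", "a", "target", "word"],
]

# sg=1 selects the skip-gram objective; vector_size is the embedding dimension.
model = Word2Vec(corpus, vector_size=50, window=2, min_count=1, sg=1, epochs=50)

# Each vocabulary word is now mapped to a dense vector of real numbers.
vec = model.wv["words"]              # numpy array of shape (50,)
print(vec[:5])

# Nearest neighbours by cosine similarity in the embedding space.
print(model.wv.most_similar("words", topn=3))
```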

(Image credit: Dynamic Word Embedding for Evolving Semantic Discovery)

Papers

Showing 2491–2500 of 4002 papers

Title | Status | Hype
AWE: Asymmetric Word Embedding for Textual Entailment | - | 0
Unsupervised Cross-lingual Transfer of Word Embedding Spaces | Code | 0
xSense: Learning Sense-Separated Sparse Representations and Textual Definitions for Explainable Word Sense Networks | Code | 0
SHOMA at Parseme Shared Task on Automatic Identification of VMWEs: Neural Multiword Expression Tagging with High Generalisation | Code | 0
Exploration on Grounded Word Embedding: Matching Words and Images with Image-Enhanced Skip-Gram Model | - | 0
Unsupervised Cross-lingual Word Embedding by Multilingual Neural Language Models | - | 0
Learning Embeddings of Directed Networks with Text-Associated Nodes---with Applications in Software Package Dependency Networks | - | 0
Uncovering divergent linguistic information in word embeddings with lessons for intrinsic and extrinsic evaluation | Code | 0
An Analysis of Hierarchical Text Classification Using Word Embeddings | - | 0
Sentylic at IEST 2018: Gated Recurrent Neural Network and Capsule Network Based Approach for Implicit Emotion Detection | - | 0

Leaderboards

No leaderboard results yet.