
Word Embeddings

Word embedding is the collective name for a set of language modeling and feature learning techniques in natural language processing (NLP) where words or phrases from the vocabulary are mapped to vectors of real numbers.

Techniques for learning word embeddings include Word2Vec, GloVe, and other neural network-based approaches that train on an NLP task such as language modeling or document classification.
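
To make the mapping concrete, here is a minimal sketch of learning embeddings with Word2Vec via the gensim library. The toy corpus, hyperparameters, and gensim version (4.x) are illustrative assumptions, not details taken from any of the papers listed below.

    # Minimal Word2Vec sketch using gensim (assumes gensim 4.x is installed).
    from gensim.models import Word2Vec

    # Each sentence is a list of tokens; a real corpus would be far larger.
    sentences = [
        ["word", "embeddings", "map", "words", "to", "vectors"],
        ["word2vec", "learns", "embeddings", "from", "local", "context"],
        ["glove", "fits", "embeddings", "to", "global", "cooccurrence", "counts"],
    ]

    # vector_size: embedding dimensionality; window: context width;
    # min_count=1 keeps every token in this tiny toy corpus.
    model = Word2Vec(sentences, vector_size=50, window=2, min_count=1, epochs=100)

    vec = model.wv["embeddings"]                         # a 50-dimensional vector
    print(model.wv.most_similar("embeddings", topn=3))   # nearest neighbours

Each word ends up as a dense real-valued vector, so relatedness between words can be measured as the cosine similarity of their vectors.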

(Image credit: Dynamic Word Embedding for Evolving Semantic Discovery)

Papers

Showing 2831–2840 of 4002 papers

Title | Status | Hype
Retrofitting Word Representations for Unsupervised Sense Aware Word Similarities | - | 0
Comparing Pretrained Multilingual Word Embeddings on an Ontology Alignment Task | Code | 0
Social Image Tags as a Source of Word Embeddings: A Task-oriented Evaluation | - | 0
Word Embedding Evaluation Datasets and Wikipedia Title Embedding for Chinese | - | 0
Joint Learning of Sense and Word Embeddings | - | 0
Evaluation of Dictionary Creating Methods for Finno-Ugric Minority Languages | - | 0
Hierarchical Density Order Embeddings | Code | 1
Factors Influencing the Surprising Instability of Word Embeddings | Code | 0
Automated Detection of Adverse Drug Reactions in the Biomedical Literature Using Convolutional Neural Networks and Biomedical Word Embeddings | - | 0
DeepEmo: Learning and Enriching Pattern-Based Emotion Representations | Code | 0

No leaderboard results yet.