
Word Embeddings

Word embedding is the collective name for a set of language modeling and feature learning techniques in natural language processing (NLP) where words or phrases from the vocabulary are mapped to vectors of real numbers.
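To make the mapping concrete, the sketch below hard-codes a tiny word-to-vector table in NumPy and compares words by cosine similarity in the vector space. The vocabulary and vector values are invented purely for illustration, not taken from any trained model.

```python
# Illustrative only: a toy vocabulary mapped to dense real-valued
# vectors, with word similarity measured as cosine similarity.
import numpy as np

# Hypothetical 3-dimensional embeddings (real models use 50-300+ dims).
embeddings = {
    "king":  np.array([0.8, 0.3, 0.1]),
    "queen": np.array([0.7, 0.4, 0.1]),
    "apple": np.array([0.1, 0.1, 0.9]),
}

def cosine(u, v):
    """Cosine similarity between two embedding vectors."""
    return float(u @ v / (np.linalg.norm(u) * np.linalg.norm(v)))

print(cosine(embeddings["king"], embeddings["queen"]))  # high: related words
print(cosine(embeddings["king"], embeddings["apple"]))  # lower: unrelated words
```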

Techniques for learning word embeddings include Word2Vec, GloVe, and neural network-based approaches that train on an NLP task such as language modeling or document classification.
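As a concrete example, the following minimal sketch trains Word2Vec embeddings with the gensim library (gensim 4.x API); the toy corpus and hyperparameter values are illustrative assumptions, not a recommended configuration.

```python
# A minimal sketch of training Word2Vec embeddings with gensim.
# Corpus and hyperparameters are toy values for illustration.
from gensim.models import Word2Vec

# Input: a list of tokenized sentences (replace with real data).
sentences = [
    ["word", "embeddings", "map", "words", "to", "vectors"],
    ["word2vec", "learns", "embeddings", "from", "context", "windows"],
    ["similar", "words", "get", "similar", "vectors"],
]

model = Word2Vec(
    sentences,
    vector_size=50,  # dimensionality of the embedding vectors
    window=3,        # context window size
    min_count=1,     # keep every word in this tiny corpus
    sg=1,            # 1 = skip-gram, 0 = CBOW
    epochs=50,       # extra passes help on a tiny corpus
)

# Each vocabulary word is now mapped to a real-valued vector.
vec = model.wv["embeddings"]  # NumPy array of shape (50,)
print(vec.shape)

# Nearest neighbours by cosine similarity in the learned space.
print(model.wv.most_similar("words", topn=3))
```

On real corpora the defaults (larger window, min_count > 1, multiple workers) are more appropriate; the small values here only make the toy example train meaningfully.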

(Image credit: Dynamic Word Embedding for Evolving Semantic Discovery)

Papers

Showing papers 3091–3100 of 4002 (page 310 of 401).

Title | Status | Hype
Correlations between Word Vector Sets | Code | 0
CoSimLex: A Resource for Evaluating Graded Word Similarity in Context | Code | 0
Streaming Word Embeddings with the Space-Saving Algorithm | Code | 0
Breaking the Silence: Detecting and Mitigating Gendered Abuse in Hindi, Tamil, and Indian English Online Spaces | Code | 0
Contextually Propagated Term Weights for Document Representation | Code | 0
Multilingual Irony Detection with Dependency Syntax and Neural Models | Code | 0
Automated Generation of Multilingual Clusters for the Evaluation of Distributed Representations | Code | 0
Creative Contextual Dialog Adaptation in an Open World RPG | Code | 0
Improving Lexical Choice in Neural Machine Translation | Code | 0
Improving Lexical Embeddings with Semantic Knowledge | Code | 0
