
Word Embeddings

Word embedding is the collective name for a set of language modeling and feature learning techniques in natural language processing (NLP) where words or phrases from the vocabulary are mapped to vectors of real numbers.

Techniques for learning word embeddings include Word2Vec, GloVe, and other neural-network-based approaches that are trained on an NLP task such as language modeling or document classification.

(Image credit: Dynamic Word Embedding for Evolving Semantic Discovery)
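As a minimal sketch of the idea, the snippet below trains skip-gram Word2Vec embeddings with the gensim library and queries the learned vector space by cosine similarity. The toy corpus and all parameter values are illustrative assumptions, not part of any paper listed on this page.

```python
# Minimal word-embedding sketch, assuming gensim is installed
# (pip install gensim). The corpus here is a toy example only.
from gensim.models import Word2Vec

# Each "sentence" is a list of tokens; real corpora are far larger.
corpus = [
    ["word", "embeddings", "map", "words", "to", "vectors"],
    ["glove", "and", "word2vec", "learn", "word", "vectors"],
    ["vectors", "capture", "semantic", "similarity", "between", "words"],
]

# sg=1 selects the skip-gram objective; vector_size is the
# dimensionality of the real-valued vector assigned to each word.
model = Word2Vec(corpus, vector_size=50, window=2, min_count=1, sg=1, epochs=50)

vec = model.wv["word"]   # a 50-dimensional real-valued vector
print(vec.shape)         # (50,)

# Nearest neighbours by cosine similarity in the learned space.
print(model.wv.most_similar("word", topn=3))
```

On a realistic corpus, semantically related words end up with nearby vectors, which is what downstream tasks such as document classification exploit.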

Papers

Showing 941–950 of 4,002 papers (page 95 of 401)

| Title | Status | Hype |
| --- | --- | --- |
| AutoExtend: Extending Word Embeddings to Embeddings for Synsets and Lexemes | | 0 |
| Cross-Lingual Pronoun Prediction with Deep Recurrent Neural Networks v2.0 | | 0 |
| Cross-Lingual Suicidal-Oriented Word Embedding toward Suicide Prevention | | 0 |
| Cross-Lingual Syntactically Informed Distributed Word Representations | | 0 |
| BioReddit: Word Embeddings for User-Generated Biomedical NLP | | 0 |
| Cross-lingual Transfer for Unsupervised Dependency Parsing Without Parallel Data | | 0 |
| Cross-Lingual Transfer Learning for Hate Speech Detection | | 0 |
| Cross-Lingual Transfer Learning for POS Tagging without Cross-Lingual Resources | | 0 |
| Cross-lingual Transfer of Sentiment Classifiers | | 0 |
| Abstractive Document Summarization with Word Embedding Reconstruction | | 0 |

No leaderboard results yet.