
Word Embeddings

Word embedding is the collective name for a set of language modeling and feature learning techniques in natural language processing (NLP) where words or phrases from the vocabulary are mapped to vectors of real numbers.
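To make the mapping concrete, here is a minimal Python sketch of an embedding lookup table: each vocabulary word indexes a row of a real-valued matrix, and vectors are compared with cosine similarity. The vectors below are random stand-ins for trained embeddings, and the vocab/embed/cosine names are purely illustrative.

import numpy as np

# Toy vocabulary: each word maps to a row index of a real-valued matrix.
vocab = {"king": 0, "queen": 1, "apple": 2}

# Random 4-dimensional vectors stand in for trained embeddings here;
# a real model would learn these rows from a corpus.
rng = np.random.default_rng(0)
embeddings = rng.normal(size=(len(vocab), 4))

def embed(word):
    # Look up the vector of real numbers assigned to a word.
    return embeddings[vocab[word]]

def cosine(u, v):
    # Cosine similarity is the standard way to compare embedding vectors.
    return float(u @ v / (np.linalg.norm(u) * np.linalg.norm(v)))

print(embed("king"))                          # a 4-dimensional real vector
print(cosine(embed("king"), embed("queen")))  # near 0 for untrained random vectors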

Techniques for learning word embeddings include Word2Vec, GloVe, and other neural network-based approaches that train on an NLP task such as language modeling or document classification.
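As one example of such a technique, the sketch below trains a small Word2Vec model. It assumes the gensim library (version 4.x, whose constructor takes vector_size) is installed, and the four-sentence corpus is a toy chosen for illustration.

from gensim.models import Word2Vec

# A tiny toy corpus: each sentence is a list of tokens.
sentences = [
    ["the", "king", "rules", "the", "kingdom"],
    ["the", "queen", "rules", "the", "kingdom"],
    ["the", "fox", "eats", "the", "apple"],
    ["the", "fox", "runs", "in", "the", "forest"],
]

# Skip-gram Word2Vec: learn vectors by predicting context words
# from each target word.
model = Word2Vec(
    sentences,
    vector_size=50,  # dimensionality of the learned vectors
    window=3,        # context window size
    min_count=1,     # keep every word, even singletons
    sg=1,            # 1 = skip-gram, 0 = CBOW
    epochs=200,      # many passes, since the corpus is tiny
)

vec = model.wv["king"]                        # the learned vector for "king"
print(model.wv.most_similar("king", topn=3))  # nearest neighbours in vector space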

(Image credit: Dynamic Word Embedding for Evolving Semantic Discovery)

Papers

Showing 921–930 of 4002 papers (page 93 of 401)

Title | Hype
aueb.twitter.sentiment at SemEval-2016 Task 4: A Weighted Ensemble of SVMs for Twitter Sentiment Analysis | 0
Cross-language Learning with Adversarial Neural Networks | 0
Cross-Language Question Re-Ranking | 0
Analyzing the Representational Geometry of Acoustic Word Embeddings | 0
BioReddit: Word Embeddings for User-Generated Biomedical NLP | 0
Cross-lingual alignments of ELMo contextual embeddings | 0
Advancing Fake News Detection: Hybrid Deep Learning with FastText and Explainable AI | 0
A bag-of-concepts model improves relation extraction in a narrow knowledge domain with limited data | 0
Augmenting Small Data to Classify Contextualized Dialogue Acts for Exploratory Visualization | 0
Abstractive Document Summarization with Word Embedding Reconstruction | 0

No leaderboard results yet.