
Word Embeddings

Word embedding is the collective name for a set of language modeling and feature learning techniques in natural language processing (NLP) where words or phrases from the vocabulary are mapped to vectors of real numbers.
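
As a concrete illustration of this mapping, the sketch below builds a tiny embedding table and compares words by cosine similarity; the words and vector values are made up for illustration, not learned embeddings, and real embeddings typically have hundreds of dimensions.

```python
import numpy as np

# Hypothetical 4-dimensional embedding table (illustrative values only).
embeddings = {
    "king":  np.array([0.50, 0.68, -0.59, 0.10]),
    "queen": np.array([0.54, 0.71, -0.55, 0.60]),
    "apple": np.array([-0.80, 0.11, 0.45, -0.30]),
}

def cosine_similarity(u: np.ndarray, v: np.ndarray) -> float:
    """Cosine of the angle between two vectors; higher means more similar."""
    return float(u @ v / (np.linalg.norm(u) * np.linalg.norm(v)))

# Semantically related words should end up closer together in the space.
print(cosine_similarity(embeddings["king"], embeddings["queen"]))  # high
print(cosine_similarity(embeddings["king"], embeddings["apple"]))  # low
```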

Techniques for learning word embeddings include Word2Vec, which trains a shallow neural network on a context-prediction task; GloVe, which fits vectors to word co-occurrence statistics; and other neural approaches that learn embeddings while training on an NLP task such as language modeling or document classification.
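
Below is a minimal training sketch using the gensim library's Word2Vec implementation (assuming gensim 4.x); the corpus and hyperparameters are toy choices for illustration, and a real model would be trained on a much larger corpus.

```python
from gensim.models import Word2Vec

# Toy corpus: each document is a list of tokens.
corpus = [
    ["word", "embeddings", "map", "words", "to", "vectors"],
    ["word2vec", "learns", "embeddings", "from", "raw", "text"],
    ["glove", "learns", "embeddings", "from", "cooccurrence", "counts"],
]

# sg=1 selects the skip-gram objective; sg=0 would use CBOW.
model = Word2Vec(
    corpus,
    vector_size=50,   # dimensionality of the learned vectors
    window=3,         # context window size
    min_count=1,      # keep every token in this tiny corpus
    sg=1,
    epochs=50,
)

vec = model.wv["embeddings"]                      # learned vector for a word
print(model.wv.most_similar("word2vec", topn=3))  # nearest neighbors by cosine
```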

(Image credit: Dynamic Word Embedding for Evolving Semantic Discovery)

Papers

Showing 1301–1310 of 4002 papers

Title | Status | Hype
Cross-Lingual Classification of Topics in Political Texts | | 0
Augmenting Small Data to Classify Contextualized Dialogue Acts for Exploratory Visualization | | 0
Advancing Fake News Detection: Hybrid Deep Learning with FastText and Explainable AI | | 0
A comparative study of word embeddings and other features for lexical complexity detection in French | | 0
A bag-of-concepts model improves relation extraction in a narrow knowledge domain with limited data | | 0
Cross-lingual alignments of ELMo contextual embeddings | | 0
Augmenting NLP models using Latent Feature Interpolations | | 0
Cross-Language Question Re-Ranking | | 0
Analyzing the Representational Geometry of Acoustic Word Embeddings | | 0
Cross-language Learning with Adversarial Neural Networks | | 0
Page 131 of 401

Leaderboards

No leaderboard results yet.