Word Embeddings

Word embedding is the collective name for a set of language modeling and feature learning techniques in natural language processing (NLP) where words or phrases from the vocabulary are mapped to vectors of real numbers.
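As a toy illustration of that mapping, the sketch below stores hand-picked placeholder vectors (not trained embeddings) in a Python dictionary and compares them with cosine similarity; the words, values, and 4-dimensional size are all illustrative assumptions:

```python
import numpy as np

# Toy embedding table: each word maps to a dense vector of real numbers.
# These 4-dimensional vectors are made-up placeholders; real embeddings
# are typically 50-300 dimensions and learned from large corpora.
embeddings = {
    "king":  np.array([0.50, 0.68, -0.59, 0.20]),
    "queen": np.array([0.54, 0.86, -0.57, 0.32]),
    "apple": np.array([-0.15, 0.22, 0.61, -0.45]),
}

def cosine_similarity(u, v):
    """Cosine of the angle between two vectors; values near 1.0 mean similar."""
    return float(np.dot(u, v) / (np.linalg.norm(u) * np.linalg.norm(v)))

print(cosine_similarity(embeddings["king"], embeddings["queen"]))  # high
print(cosine_similarity(embeddings["king"], embeddings["apple"]))  # lower
```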

Techniques for learning word embeddings include Word2Vec, GloVe, and other neural network approaches that train on an NLP task such as language modeling or document classification.
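As a concrete sketch of one such technique, the snippet below trains a small skip-gram Word2Vec model with the gensim library; the three-sentence corpus and the hyperparameters are illustrative stand-ins, not a recommended setup:

```python
from gensim.models import Word2Vec

# Tiny stand-in corpus; real training uses millions of tokenized sentences.
sentences = [
    ["word", "embeddings", "map", "words", "to", "vectors"],
    ["word2vec", "learns", "embeddings", "from", "local", "context"],
    ["glove", "learns", "embeddings", "from", "cooccurrence", "statistics"],
]

# Train a small skip-gram model (sg=1); vector_size is the embedding dimension.
model = Word2Vec(
    sentences, vector_size=50, window=2, min_count=1, sg=1, epochs=50
)

vec = model.wv["embeddings"]          # the learned 50-dimensional vector
print(model.wv.most_similar("word"))  # nearest neighbors by cosine similarity
```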

(Image credit: Dynamic Word Embedding for Evolving Semantic Discovery)

Papers

Showing 901–910 of 4002 papers

Title | Status | Hype
Deep Learning Models in Detection of Dietary Supplement Adverse Event Signals from Twitter | - | 0
WG4Rec: Modeling Textual Content with Word Graph for News Recommendation | Code | 0
An Improved Single Step Non-autoregressive Transformer for Automatic Speech Recognition | - | 0
Improving Entity Linking through Semantic Reinforced Entity Embeddings | Code | 1
Do Acoustic Word Embeddings Capture Phonological Similarity? An Empirical Study | Code | 0
Semantic Representation and Inference for NLP | - | 0
Deriving Word Vectors from Contextualized Language Models using Topic-Aware Mention Selection | Code | 0
PairConnect: A Compute-Efficient MLP Alternative to Attention | - | 0
Shape of Elephant: Study of Macro Properties of Word Embeddings Spaces | - | 0
Predicting the Ordering of Characters in Japanese Historical Documents | - | 0

No leaderboard results yet.