
Word Embeddings

Word embedding is the collective name for a set of language modeling and feature learning techniques in natural language processing (NLP) where words or phrases from the vocabulary are mapped to vectors of real numbers.

Techniques for learning word embeddings include Word2Vec, GloVe, and other neural network-based approaches that train on an NLP task such as language modeling or document classification.
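To make the mapping from words to real-valued vectors concrete, below is a minimal sketch of training skip-gram Word2Vec embeddings with the gensim library (assuming gensim 4.x). The toy corpus and hyperparameter values are illustrative assumptions, not taken from any paper listed on this page.

```python
# Minimal sketch: skip-gram Word2Vec embeddings with gensim 4.x.
# The toy corpus and hyperparameters are illustrative assumptions.
from gensim.models import Word2Vec

# Each "sentence" is a pre-tokenized list of words.
corpus = [
    ["the", "king", "rules", "the", "kingdom"],
    ["the", "queen", "rules", "the", "kingdom"],
    ["the", "cat", "sits", "on", "the", "mat"],
    ["the", "dog", "sits", "on", "the", "rug"],
]

model = Word2Vec(
    sentences=corpus,
    vector_size=50,   # dimensionality of the real-valued word vectors
    window=2,         # context window on each side of the target word
    min_count=1,      # keep every word, even singletons (toy corpus)
    sg=1,             # 1 = skip-gram; 0 = CBOW
    epochs=200,       # many passes, since the corpus is tiny
    seed=42,
)

# Every vocabulary word is now mapped to a vector of real numbers.
vec = model.wv["king"]   # numpy array of shape (50,)
print(vec.shape)

# Cosine similarity in the embedding space reflects distributional
# similarity: words occurring in similar contexts end up close together.
print(model.wv.similarity("cat", "dog"))
print(model.wv.most_similar("king", topn=3))
```

On a realistic corpus the same calls apply unchanged; only the corpus size, `vector_size`, and `epochs` would typically change.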

(Image credit: Dynamic Word Embedding for Evolving Semantic Discovery)

Papers

Showing 301–310 of 4,002 papers

Title | Hype
Adjusting Word Embeddings with Semantic Intensity Orders | 0
Analyzing autoencoder-based acoustic word embeddings | 0
A Comparative Study of Neural Network Models for Sentence Classification | 0
4chan & 8chan embeddings | 0
Argument from Old Man’s View: Assessing Social Bias in Argumentation | 0
Artificial intelligence prediction of stock prices using social media | 0
A Simple and Efficient Probabilistic Language model for Code-Mixed Text | 0
Analyzing Acoustic Word Embeddings from Pre-trained Self-supervised Speech Models | 0
A Distribution-based Model to Learn Bilingual Word Embeddings | 0
A Comparative Study of Embedding Models in Predicting the Compositionality of Multiword Expressions | 0

No leaderboard results yet.