
Word Embeddings

Word embedding is the collective name for a set of language modeling and feature learning techniques in natural language processing (NLP) where words or phrases from the vocabulary are mapped to vectors of real numbers.
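The mapping described above can be sketched with toy vectors. The embedding values below are hypothetical and chosen only for illustration; real embeddings are learned from data, but the key property is the same: semantically related words end up with similar vectors, which is typically measured with cosine similarity.

```python
import math

# Hypothetical 3-dimensional embeddings (illustrative values, not learned).
embeddings = {
    "king":  [0.80, 0.65, 0.10],
    "queen": [0.75, 0.70, 0.15],
    "apple": [0.10, 0.20, 0.90],
}

def cosine(u, v):
    """Cosine similarity: dot(u, v) / (|u| * |v|)."""
    dot = sum(a * b for a, b in zip(u, v))
    norm_u = math.sqrt(sum(a * a for a in u))
    norm_v = math.sqrt(sum(b * b for b in v))
    return dot / (norm_u * norm_v)

# Related words score higher than unrelated ones.
print(cosine(embeddings["king"], embeddings["queen"]) >
      cosine(embeddings["king"], embeddings["apple"]))  # → True
```

This nearest-neighbor structure in the vector space is what downstream NLP tasks exploit when they consume pretrained embeddings as features.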

Techniques for learning word embeddings include Word2Vec, GloVe, and other neural-network-based approaches that train on an NLP task such as language modeling or document classification.
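As one concrete piece of how Word2Vec-style training works: the skip-gram formulation turns a corpus into (target, context) word pairs taken from a sliding window, and the embeddings are then trained to predict context from target. A minimal sketch of the pair-generation step (the function name and window size are illustrative, not from any particular library):

```python
def skipgram_pairs(tokens, window=2):
    """Generate (target, context) pairs from a token list,
    pairing each word with its neighbors within `window` positions."""
    pairs = []
    for i, target in enumerate(tokens):
        lo = max(0, i - window)
        hi = min(len(tokens), i + window + 1)
        for j in range(lo, hi):
            if j != i:  # a word is not its own context
                pairs.append((target, tokens[j]))
    return pairs

print(skipgram_pairs(["the", "cat", "sat"], window=1))
# → [('the', 'cat'), ('cat', 'the'), ('cat', 'sat'), ('sat', 'cat')]
```

A full Word2Vec implementation would feed these pairs into a shallow network (with negative sampling or hierarchical softmax); GloVe instead fits embeddings to global co-occurrence counts.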

(Image credit: Dynamic Word Embedding for Evolving Semantic Discovery)

Papers

Showing 491–500 of 4,002 papers

Self-Supervised Acoustic Word Embedding Learning via Correspondence Transformer Encoder
A Topical Approach to Capturing Customer Insight In Social Media
Convolutional Neural Networks for Sentiment Analysis on Weibo Data: A Natural Language Processing Approach
Bidirectional Attention as a Mixture of Continuous Word Experts (code available)
Evaluating Biased Attitude Associations of Language Models in an Intersectional Context (code available)
Undecimated Wavelet Transform for Word Embedded Semantic Marginal Autoencoder in Security improvement and Denoising different Languages
Leveraging multilingual transfer for unsupervised semantic acoustic word embeddings
Conceptual Cognitive Maps Formation with Neural Successor Networks and Word Embeddings
Racial Bias Trends in the Text of US Legal Opinions
Social World Knowledge: Modeling and Applications

No leaderboard results yet.