
Word Embeddings

Word embedding is the collective name for a set of language modeling and feature learning techniques in natural language processing (NLP) where words or phrases from the vocabulary are mapped to vectors of real numbers.
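To illustrate the idea, here is a minimal sketch using made-up 3-dimensional vectors (real embeddings typically have hundreds of dimensions): once words are mapped to vectors, semantic relatedness can be measured as the cosine of the angle between them.

```python
import math

# Toy embedding table; the vectors below are invented for illustration only.
embeddings = {
    "king":  [0.9, 0.8, 0.1],
    "queen": [0.8, 0.9, 0.1],
    "apple": [0.1, 0.1, 0.9],
}

def cosine(u, v):
    """Cosine similarity: dot product divided by the product of the norms."""
    dot = sum(a * b for a, b in zip(u, v))
    return dot / (math.hypot(*u) * math.hypot(*v))

# Related words should score higher than unrelated ones.
print(cosine(embeddings["king"], embeddings["queen"]))  # close to 1
print(cosine(embeddings["king"], embeddings["apple"]))  # much lower
```

With these toy vectors, "king" and "queen" score near 1 while "king" and "apple" score far lower, which is the behavior trained embeddings exhibit for semantically related words.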

Techniques for learning word embeddings include Word2Vec, GloVe, and other neural-network-based approaches that train on an NLP task such as language modeling or document classification.
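For instance, Word2Vec's skip-gram variant trains on (center, context) word pairs drawn from a sliding window over the corpus. The following is a minimal sketch of that pair-extraction step only (the corpus and window size here are illustrative, and the downstream training loop is omitted):

```python
def skipgram_pairs(tokens, window=2):
    """Generate (center, context) pairs within a symmetric window."""
    pairs = []
    for i, center in enumerate(tokens):
        # Context positions within `window` words of the center, excluding itself.
        for j in range(max(0, i - window), min(len(tokens), i + window + 1)):
            if j != i:
                pairs.append((center, tokens[j]))
    return pairs

tokens = "the quick brown fox jumps".split()
pairs = skipgram_pairs(tokens)
print(pairs[:3])
```

Each pair becomes one training example: the model learns vectors such that a center word's vector predicts its context words, which is what pushes words used in similar contexts toward similar vectors.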

(Image credit: Dynamic Word Embedding for Evolving Semantic Discovery)

Papers

Showing 901–925 of 4002 papers

Title | Status | Hype
Membership Inference on Word Embedding and Beyond | | 0
WG4Rec: Modeling Textual Content with Word Graph for News Recommendation | Code | 0
An Improved Single Step Non-autoregressive Transformer for Automatic Speech Recognition | | 0
Improving Entity Linking through Semantic Reinforced Entity Embeddings | Code | 1
Do Acoustic Word Embeddings Capture Phonological Similarity? An Empirical Study | Code | 0
PairConnect: A Compute-Efficient MLP Alternative to Attention | | 0
Semantic Representation and Inference for NLP | | 0
Deriving Word Vectors from Contextualized Language Models using Topic-Aware Mention Selection | Code | 0
Shape of Elephant: Study of Macro Properties of Word Embeddings Spaces | | 0
Predicting the Ordering of Characters in Japanese Historical Documents | | 0
CogAlign: Learning to Align Textual Neural Representations to Cognitive Language Processing Signals | Code | 0
Low-Dimensional Structure in the Space of Language Representations is Reflected in Brain Responses | Code | 0
Case Studies on using Natural Language Processing Techniques in Customer Relationship Management Software | | 0
Obtaining Better Static Word Embeddings Using Contextual Embedding Models | Code | 1
Combining Static Word Embeddings and Contextual Representations for Bilingual Lexicon Induction | Code | 1
Denoising Word Embeddings by Averaging in a Shared Space | | 0
A General Method for Event Detection on Social Media | | 0
Evaluating Word Embeddings with Categorical Modularity | Code | 0
Looking for a Role for Word Embeddings in Eye-Tracking Features Prediction: Does Semantic Similarity Help? | | 0
Experiments on a Guarani Corpus of News and Social Media | | 0
Leveraging English Word Embeddings for Semi-Automatic Semantic Classification in Nêhiyawêwin (Plains Cree) | | 0
CogNLP-Sheffield at CMCL 2021 Shared Task: Blending Cognitively Inspired Features with Transformer-based Language Models for Predicting Eye Tracking Patterns | | 0
Sentence Complexity in Context | | 0
Non-Complementarity of Information in Word-Embedding and Brain Representations in Distinguishing between Concrete and Abstract Words | | 0
NARNIA at NLP4IF-2021: Identification of Misinformation in COVID-19 Tweets Using BERTweet | | 0

No leaderboard results yet.