
Word Embeddings

Word embedding is the collective name for a set of language modeling and feature learning techniques in natural language processing (NLP) where words or phrases from the vocabulary are mapped to vectors of real numbers.

Techniques for learning word embeddings include Word2Vec, GloVe, and neural network-based approaches that train embeddings on an NLP task such as language modeling or document classification.
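As a minimal sketch of what such training looks like in practice, the snippet below fits skip-gram Word2Vec embeddings with the gensim library (assuming gensim 4.x is installed; the toy corpus and all parameter values are illustrative, not taken from any of the papers listed here):

```python
# Minimal Word2Vec training sketch (gensim 4.x assumed).
# The toy corpus is purely illustrative; real embeddings need large corpora.
from gensim.models import Word2Vec

corpus = [
    ["word", "embeddings", "map", "words", "to", "vectors"],
    ["word2vec", "learns", "embeddings", "from", "context", "windows"],
    ["similar", "words", "end", "up", "with", "similar", "vectors"],
]

model = Word2Vec(
    sentences=corpus,
    vector_size=50,  # dimensionality of the real-valued vectors
    window=2,        # context window on each side of the target word
    min_count=1,     # keep every word in this tiny corpus
    sg=1,            # 1 = skip-gram; 0 = CBOW
    epochs=200,      # many passes to compensate for the tiny corpus
)

vec = model.wv["embeddings"]           # the learned 50-dimensional vector
print(vec.shape)                       # (50,)
print(model.wv.most_similar("words"))  # nearest neighbours by cosine similarity
```

Once trained, the `model.wv` word-to-vector mapping is what downstream tasks consume, for example as the input layer of a document classifier.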

(Image credit: Dynamic Word Embedding for Evolving Semantic Discovery)

Papers

Showing 2011–2020 of 4002 papers

Title | Status | Hype
Probing Word and Sentence Embeddings for Long-distance Dependencies Effects in French and English |  | 0
Problems With Evaluation of Word Embeddings Using Word Similarity Tasks |  | 0
Protecting Copyright of Medical Pre-trained Language Models: Training-Free Backdoor Model Watermarking |  | 0
Pruning Basic Elements for Better Automatic Evaluation of Summaries |  | 0
PSDVec: a Toolbox for Incremental and Scalable Word Embedding |  | 0
PublishInCovid19 at WNUT 2020 Shared Task-1: Entity Recognition in Wet Lab Protocols using Structured Learning Ensemble and Contextualised Embeddings |  | 0
Punctuation Prediction in Spontaneous Conversations: Can We Mitigate ASR Errors with Retrofitted Word Embeddings? |  | 0
PurdueNLP at SemEval-2017 Task 1: Predicting Semantic Textual Similarity with Paraphrase and Event Embeddings |  | 0
PQLM -- Multilingual Decentralized Portable Quantum Language Model for Privacy Protection |  | 0
QLUT at SemEval-2017 Task 1: Semantic Textual Similarity Based on Word Embeddings |  | 0
Page 202 of 401

No leaderboard results yet.