
Word Embeddings

Word embedding is the collective name for a set of language modeling and feature learning techniques in natural language processing (NLP) where words or phrases from the vocabulary are mapped to vectors of real numbers.
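The mapping itself is just a lookup into a matrix of real-valued vectors, one row per vocabulary word, with similarity typically measured by cosine distance. A minimal sketch in NumPy follows; the three-word vocabulary, the 50-dimensional size, and the random initialization are all illustrative values, not part of any particular method:

```python
import numpy as np

# Toy vocabulary and a random embedding matrix: one row of real
# numbers per word. In practice these rows are learned, not random.
vocab = {"king": 0, "queen": 1, "apple": 2}
rng = np.random.default_rng(0)
embeddings = rng.normal(size=(len(vocab), 50))  # 50-dimensional vectors

def vector(word):
    """Map a word to its real-valued vector (the embedding lookup)."""
    return embeddings[vocab[word]]

def cosine(u, v):
    """Cosine similarity, the usual closeness measure in embedding space."""
    return float(u @ v / (np.linalg.norm(u) * np.linalg.norm(v)))

print(cosine(vector("king"), vector("queen")))
```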

Techniques for learning word embeddings include Word2Vec, GloVe, and other neural-network-based approaches that train on an NLP task such as language modeling or document classification.

(Image credit: Dynamic Word Embedding for Evolving Semantic Discovery)
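As a concrete example of one such technique, here is a minimal Word2Vec skip-gram training run using the gensim library (gensim 4.x API); the corpus and the hyperparameters are toy values chosen for illustration only:

```python
from gensim.models import Word2Vec

# Tiny toy corpus: each sentence is a list of tokens.
sentences = [
    ["the", "king", "rules", "the", "kingdom"],
    ["the", "queen", "rules", "the", "kingdom"],
    ["the", "child", "eats", "an", "apple"],
]

# Skip-gram Word2Vec (sg=1); vector_size, window, and epochs are illustrative.
model = Word2Vec(sentences, vector_size=50, window=2,
                 min_count=1, sg=1, epochs=50)

vec = model.wv["queen"]               # learned 50-dim vector for "queen"
print(model.wv.most_similar("king"))  # nearest neighbors in embedding space
```

Words that occur in similar contexts ("king" and "queen" above) end up with similar vectors, which is the property the papers listed below build on.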

Papers

Showing 1051–1060 of 4002 papers

Title | Status | Hype
How COVID-19 Is Changing Our Language: Detecting Semantic Shift in Twitter Word Embeddings | - | 0
OntoZSL: Ontology-enhanced Zero-shot Learning | Code | 1
Content-Aware Speaker Embeddings for Speaker Diarisation | - | 0
Points2Vec: Unsupervised Object-level Feature Learning from Point Clouds | - | 0
A study of text representations in Hate Speech Detection | Code | 0
A Note on Argumentative Topology: Circularity and Syllogisms as Unsolved Problems | - | 0
Focusing Knowledge-based Graph Argument Mining via Topic Modeling | - | 0
Bootstrapping Multilingual AMR with Contextual Word Alignments | - | 0
Using Word Embeddings to Uncover Discourses | - | 0
Short Text Clustering with Transformers | - | 0
Page 106 of 401
