Word Embeddings

Word embedding is the collective name for a set of language modeling and feature learning techniques in natural language processing (NLP) where words or phrases from the vocabulary are mapped to vectors of real numbers.
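To make the definition concrete, the toy sketch below maps a few words to real-valued vectors and compares them with cosine similarity. The three-dimensional values are invented purely for illustration; learned embeddings typically have hundreds of dimensions.

```python
import math

# Toy embedding table: each word maps to a vector of real numbers.
# These 3-dimensional values are made up for illustration only.
embeddings = {
    "king":  [0.80, 0.65, 0.10],
    "queen": [0.78, 0.70, 0.12],
    "apple": [0.05, 0.20, 0.90],
}

def cosine(u, v):
    """Cosine similarity: closer to 1.0 means more similar vectors."""
    dot = sum(a * b for a, b in zip(u, v))
    norm_u = math.sqrt(sum(a * a for a in u))
    norm_v = math.sqrt(sum(b * b for b in v))
    return dot / (norm_u * norm_v)

print(cosine(embeddings["king"], embeddings["queen"]))  # high
print(cosine(embeddings["king"], embeddings["apple"]))  # lower
```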

Techniques for learning word embeddings include neural prediction models such as Word2Vec, count-based methods such as GloVe (which factorizes global co-occurrence statistics), and deep networks that learn embeddings as a by-product of training on an NLP task such as language modeling or document classification.
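As a minimal sketch of the Word2Vec approach, the example below trains a skip-gram model on a tiny corpus with the gensim library (assuming gensim ≥ 4.0; the corpus and hyperparameters are illustrative, not taken from any paper listed here):

```python
from gensim.models import Word2Vec

# Tiny illustrative corpus: each sentence is a list of tokens.
sentences = [
    ["the", "king", "rules", "the", "kingdom"],
    ["the", "queen", "rules", "the", "kingdom"],
    ["the", "apple", "fell", "from", "the", "tree"],
]

# sg=1 selects the skip-gram objective; vector_size is the embedding
# dimensionality; min_count=1 keeps every token in this toy corpus.
model = Word2Vec(sentences, vector_size=50, window=2, min_count=1, sg=1)

vec = model.wv["king"]                # a 50-dimensional real-valued vector
print(model.wv.most_similar("king"))  # nearest neighbours by cosine similarity
```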

(Image credit: Dynamic Word Embedding for Evolving Semantic Discovery)

Papers

Showing 491–500 of 4002 papers

Title | Status | Hype
CS-Embed at SemEval-2020 Task 9: The effectiveness of code-switched word embeddings for sentiment analysis | Code | 0
Deep convolutional acoustic word embeddings using word-pair side information | Code | 0
Geological Inference from Textual Data using Word Embeddings | Code | 0
Automatic Detection of Sexist Statements Commonly Used at the Workplace | Code | 0
Dependency Sensitive Convolutional Neural Networks for Modeling Sentences and Documents | Code | 0
Cross-lingual Models of Word Embeddings: An Empirical Comparison | Code | 0
A Neighbourhood-Aware Differential Privacy Mechanism for Static Word Embeddings | Code | 0
Grammatical gender associations outweigh topical gender bias in crosslinguistic word embeddings | Code | 0
Cross-lingual Dependency Parsing with Unlabeled Auxiliary Languages | Code | 0
Cross-Lingual BERT Transformation for Zero-Shot Dependency Parsing | Code | 0
Page 50 of 401

No leaderboard results yet.