
Word Embeddings

Word embedding is the collective name for a set of language modeling and feature learning techniques in natural language processing (NLP) where words or phrases from the vocabulary are mapped to vectors of real numbers.
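To make the mapping concrete, here is a minimal sketch in Python, assuming NumPy is available. The three-dimensional vectors are made-up illustrative values, not trained embeddings; the point is only that words become real-valued vectors and geometric closeness stands in for semantic similarity.

```python
import numpy as np

# Hypothetical word -> vector mapping; values are illustrative only.
embeddings = {
    "king":  np.array([0.50, 0.68, 0.12]),
    "queen": np.array([0.52, 0.71, 0.15]),
    "apple": np.array([0.91, 0.03, 0.44]),
}

def cosine_similarity(u: np.ndarray, v: np.ndarray) -> float:
    """Cosine of the angle between two embedding vectors."""
    return float(np.dot(u, v) / (np.linalg.norm(u) * np.linalg.norm(v)))

# Semantically related words should score higher than unrelated ones.
print(cosine_similarity(embeddings["king"], embeddings["queen"]))  # ~1.0, high
print(cosine_similarity(embeddings["king"], embeddings["apple"]))  # lower
```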

Techniques for learning word embeddings include predictive neural models such as Word2Vec, count-based methods such as GloVe, and other approaches that train on an NLP objective such as language modeling or document classification. A runnable sketch of one such technique follows.
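The sketch below trains a tiny skip-gram Word2Vec model with gensim (the 4.x API is assumed); the toy corpus and hyperparameters are arbitrary choices for demonstration, not a recommended configuration.

```python
from gensim.models import Word2Vec

# Toy corpus: a list of tokenized sentences (illustrative only).
corpus = [
    ["the", "king", "rules", "the", "kingdom"],
    ["the", "queen", "rules", "the", "kingdom"],
    ["the", "farmer", "grows", "an", "apple"],
]

model = Word2Vec(
    sentences=corpus,
    vector_size=50,   # dimensionality of the learned vectors
    window=2,         # context window around each target word
    min_count=1,      # keep every word in this tiny corpus
    sg=1,             # 1 = skip-gram; 0 would train the CBOW variant
    epochs=50,
)

vector = model.wv["king"]                          # learned embedding
neighbours = model.wv.most_similar("king", topn=2) # nearest words
print(vector.shape, neighbours)
```

On a corpus this small the neighbour lists are essentially noise; meaningful similarities emerge only with substantially more training text.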

(Image credit: Dynamic Word Embedding for Evolving Semantic Discovery)

Papers

Showing 3581–3590 of 4002 papers

Title | Status | Hype
A Survey of Word Embeddings Evaluation Methods | Code | 0
An Evaluation Dataset for Legal Word Embedding: A Case Study On Chinese Codex | Code | 0
Evaluating Multilingual Text Encoders for Unsupervised Cross-Lingual Retrieval | Code | 0
Learning Text Representations for 500K Classification Tasks on Named Entity Disambiguation | Code | 0
Word-level Textual Adversarial Attacking as Combinatorial Optimization | Code | 0
Evaluating Neural Word Embeddings for Sanskrit | Code | 0
SemRoDe: Macro Adversarial Training to Learn Representations That are Robust to Word-Level Attacks | Code | 0
Unsupervised Keyphrase Extraction from Scientific Publications | Code | 0
IdBench: Evaluating Semantic Representations of Identifier Names in Source Code | Code | 0
Evaluating shallow and deep learning strategies for the 2018 n2c2 shared task on clinical text classification | Code | 0

Leaderboard

No leaderboard results yet.