
Word Embeddings

Word embedding is the collective name for a set of language modeling and feature learning techniques in natural language processing (NLP) where words or phrases from the vocabulary are mapped to vectors of real numbers.
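
To make the mapping concrete, here is a minimal sketch in Python. The vectors below are invented for illustration (real embeddings are learned from data); the point is that each word corresponds to a vector of real numbers, and geometric similarity between vectors can stand in for semantic similarity between words.

```python
import numpy as np

# A toy vocabulary mapped to 4-dimensional real-valued vectors.
# These numbers are made up for illustration; real embeddings are learned.
embeddings = {
    "king":  np.array([0.80, 0.10, 0.70, 0.20]),
    "queen": np.array([0.75, 0.15, 0.80, 0.25]),
    "apple": np.array([0.10, 0.90, 0.05, 0.60]),
}

def cosine_similarity(u, v):
    """Cosine of the angle between two vectors: 1.0 means same direction."""
    return float(np.dot(u, v) / (np.linalg.norm(u) * np.linalg.norm(v)))

# Semantically related words should end up with similar vectors.
print(cosine_similarity(embeddings["king"], embeddings["queen"]))  # high
print(cosine_similarity(embeddings["king"], embeddings["apple"]))  # low
```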

Techniques for learning word embeddings include Word2Vec, GloVe, and other approaches that train on an NLP task such as language modeling or document classification.
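
As one hedged example of such a technique, the sketch below trains a small skip-gram Word2Vec model with the gensim library. The three-sentence corpus and the hyperparameters are illustrative assumptions only; a useful model would need far more text and tuning.

```python
from gensim.models import Word2Vec

# Tiny toy corpus of pre-tokenized sentences; a real model would be
# trained on millions of sentences.
corpus = [
    ["the", "king", "rules", "the", "kingdom"],
    ["the", "queen", "rules", "the", "kingdom"],
    ["the", "apple", "fell", "from", "the", "tree"],
]

# sg=1 selects the skip-gram objective; vector_size sets the embedding
# dimension; min_count=1 keeps every word despite the tiny corpus.
model = Word2Vec(
    sentences=corpus,
    vector_size=50,
    window=2,
    min_count=1,
    sg=1,
    epochs=50,
)

vector = model.wv["king"]             # the learned 50-dimensional embedding
print(model.wv.most_similar("king"))  # nearest neighbours in embedding space
```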

(Image credit: Dynamic Word Embedding for Evolving Semantic Discovery)

Papers

Showing 1281–1290 of 4002 papers

SHIKEBLCU at SemEval-2020 Task 2: An External Knowledge-enhanced Matrix for Multilingual and Cross-Lingual Lexical Entailment
TUE at SemEval-2020 Task 1: Detecting Semantic Change by Clustering Contextual Word Embeddings
A Review of Cross-Domain Text-to-SQL Models
A Retrofitting Model for Incorporating Semantic Relations into Word Embeddings
CLaC at SMM4H 2020: Birth Defect Mention Detection
CLaC at SemEval-2020 Task 5: Muli-task Stacked Bi-LSTMs
CitiusNLP at SemEval-2020 Task 3: Comparing Two Approaches for Word Vector Contextualization
Multi-SimLex: A Large-Scale Evaluation of Multilingual and Crosslingual Lexical Semantic Similarity
Graph-based Syntactic Word Embeddings
Can Existing Methods Debias Languages Other than English? First Attempt to Analyze and Mitigate Japanese Word Embeddings

No leaderboard results yet.