
Word Embeddings

Word embedding is the collective name for a set of language modeling and feature learning techniques in natural language processing (NLP) where words or phrases from the vocabulary are mapped to vectors of real numbers.

Techniques for learning word embeddings include Word2Vec, GloVe, and other neural-network-based approaches that are trained on an NLP task such as language modeling or document classification.
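Before neural methods like Word2Vec, the simplest way to map words to vectors was to use co-occurrence counts directly. The sketch below is not any specific paper's method, just a minimal count-based illustration (toy corpus and window size are made up for the example) of the core idea: words appearing in similar contexts end up with similar vectors, measurable by cosine similarity.

```python
from collections import Counter
from math import sqrt

# Toy corpus; real embeddings are trained on millions of sentences.
corpus = [
    "the cat sat on the mat",
    "the dog sat on the mat",
    "the car drove on the road",
]

# Count co-occurrences within a +/-1-word window.
cooc = {}
for sentence in corpus:
    tokens = sentence.split()
    for i, word in enumerate(tokens):
        ctx = cooc.setdefault(word, Counter())
        for j in (i - 1, i + 1):
            if 0 <= j < len(tokens):
                ctx[tokens[j]] += 1

vocab = sorted(cooc)

def vector(word):
    """Embed a word as its row of co-occurrence counts over the vocabulary."""
    return [cooc[word][w] for w in vocab]

def cosine(u, v):
    dot = sum(a * b for a, b in zip(u, v))
    return dot / (sqrt(sum(a * a for a in u)) * sqrt(sum(b * b for b in v)))

# "cat" and "dog" share contexts ("the _ sat"), so their vectors align;
# "car" appears in different contexts, so its similarity to "cat" is lower.
print(cosine(vector("cat"), vector("dog")))
print(cosine(vector("cat"), vector("car")))
```

Neural approaches such as Word2Vec replace the raw count rows with dense, low-dimensional vectors learned by predicting context words, but the geometric intuition (similar context, similar vector) is the same.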

(Image credit: Dynamic Word Embedding for Evolving Semantic Discovery)

Papers

Showing 3341–3350 of 4002 papers

Title | Status | Hype
Chinese Hypernym-Hyponym Extraction from User Generated Categories | | 0
Chinese Zero Pronoun Resolution with Deep Neural Networks | | 0
CitiusNLP at SemEval-2020 Task 3: Comparing Two Approaches for Word Vector Contextualization | | 0
City2City: Translating Place Representations across Cities | | 0
CLaC at SemEval-2020 Task 5: Muli-task Stacked Bi-LSTMs | | 0
CLaC at SMM4H 2020: Birth Defect Mention Detection | | 0
CLaC Lab at SemEval-2019 Task 3: Contextual Emotion Detection Using a Combination of Neural Networks and SVM | | 0
ClaiRE at SemEval-2018 Task 7: Classification of Relations using Embeddings | | 0
Class-based Prediction Errors to Detect Hate Speech with Out-of-vocabulary Words | | 0
Classification Attention for Chinese NER | | 0
Page 335 of 401

No leaderboard results yet.