
Word Embeddings

Word embedding is the collective name for a set of language modeling and feature learning techniques in natural language processing (NLP) where words or phrases from the vocabulary are mapped to vectors of real numbers.
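As a minimal sketch of this word-to-vector mapping (using a made-up toy vocabulary and randomly initialized vectors rather than a trained model), an embedding is just a lookup from a word's index into a real-valued matrix:

```python
import numpy as np

# Toy vocabulary: each word is assigned an integer index (illustrative only).
vocab = {"king": 0, "queen": 1, "man": 2, "woman": 3}

# Embedding matrix: one d-dimensional real-valued vector per word.
# Randomly initialized here; a training procedure would tune these values.
rng = np.random.default_rng(0)
embedding_dim = 8
embeddings = rng.normal(size=(len(vocab), embedding_dim))

def embed(word: str) -> np.ndarray:
    """Map a word to its vector by table lookup."""
    return embeddings[vocab[word]]

def cosine(u: np.ndarray, v: np.ndarray) -> float:
    """Cosine similarity, the usual measure of closeness between word vectors."""
    return float(u @ v / (np.linalg.norm(u) * np.linalg.norm(v)))

print(embed("king").shape)                      # (8,)
print(cosine(embed("king"), embed("queen")))    # meaningless until trained
```

With trained embeddings, the cosine similarity between semantically related words (e.g. "king" and "queen") would be high; with the random vectors above it carries no meaning.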

Techniques for learning word embeddings include Word2Vec, GloVe, and other neural network-based approaches that train on an NLP task such as language modeling or document classification.
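As a sketch of how one such technique is typically used, the snippet below trains a small skip-gram Word2Vec model with the gensim library (assuming gensim 4.x; the three-sentence corpus is a toy stand-in, and a real setting would use a large tokenized corpus):

```python
from gensim.models import Word2Vec

# Toy corpus: in practice this would be many tokenized sentences.
sentences = [
    ["word", "embeddings", "map", "words", "to", "vectors"],
    ["word2vec", "learns", "embeddings", "from", "raw", "text"],
    ["similar", "words", "get", "similar", "vectors"],
]

# Train a skip-gram model (sg=1); vector_size is the embedding dimension.
model = Word2Vec(sentences, vector_size=50, window=3,
                 min_count=1, sg=1, epochs=50)

# Look up a learned vector and query nearest neighbours by cosine similarity.
vec = model.wv["embeddings"]                    # 50-dimensional numpy array
print(model.wv.most_similar("words", topn=3))
```

The `vector_size`, `window`, and `epochs` hyperparameters control the embedding dimension, context span, and training passes; on a corpus this small the learned neighbours are not meaningful.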

(Image credit: Dynamic Word Embedding for Evolving Semantic Discovery)

Papers

Showing 471–480 of 4002 papers

Title | Status | Hype
Semeval-2022 Task 1: CODWOE – Comparing Dictionaries and Word Embeddings | – | 0
Minimally-Supervised Relation Induction from Pre-trained Language Model | – | 0
Plumeria at SemEval-2022 Task 6: Sarcasm Detection for English and Arabic Using Transformers and Data Augmentation | – | 0
Cross-Language Transfer of High-Quality Annotations: Combining Neural Machine Translation with Cross-Linguistic Span Alignment to Apply NER to Clinical Texts in a Low-Resource Language | Code | 0
Raccoons at SemEval-2022 Task 11: Leveraging Concatenated Word Embeddings for Named Entity Recognition | – | 0
Unsupervised Mitigating Gender Bias by Character Components: A Case Study of Chinese Word Embedding | – | 0
Team Stanford ACMLab at SemEval 2022 Task 4: Textual Analysis of PCL Using Contextual Word Embeddings | – | 0
Uppsala University at SemEval-2022 Task 1: Can Foreign Entries Enhance an English Reverse Dictionary? | – | 0
Language Models for Code-switch Detection of te reo Māori and English in a Low-resource Setting | – | 0
An Empirical Study on the Fairness of Pre-trained Word Embeddings | – | 0
Page 48 of 401

No leaderboard results yet.