
Word Embeddings

Word embedding is the collective name for a set of language modeling and feature learning techniques in natural language processing (NLP) where words or phrases from the vocabulary are mapped to vectors of real numbers.
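
To make the mapping concrete, here is a minimal illustrative sketch: each vocabulary word is associated with a dense vector of real numbers, and geometric closeness (e.g. cosine similarity) stands in for semantic relatedness. The three-dimensional vectors below are invented toy values; real embeddings typically have hundreds of dimensions.

```python
import numpy as np

# Hypothetical toy embeddings: each word maps to a vector of real numbers.
# The values are invented for illustration; trained embeddings would place
# semantically related words near each other in the vector space.
embeddings = {
    "king":  np.array([0.80, 0.65, 0.10]),
    "queen": np.array([0.78, 0.70, 0.12]),
    "apple": np.array([0.05, 0.10, 0.90]),
}

def cosine_similarity(a: np.ndarray, b: np.ndarray) -> float:
    """Cosine of the angle between two vectors; values near 1.0 mean
    the words occupy nearby positions in the embedding space."""
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))

print(cosine_similarity(embeddings["king"], embeddings["queen"]))  # high
print(cosine_similarity(embeddings["king"], embeddings["apple"]))  # low
```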

Techniques for learning word embeddings include count-based methods such as GloVe and neural network-based methods such as Word2Vec, typically trained on an NLP task such as language modeling or document classification.
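
As a concrete example of one such technique, the sketch below trains a skip-gram Word2Vec model with the gensim library (assuming gensim 4.x is installed). The toy corpus is invented for illustration; a real application would train on millions of tokenized sentences.

```python
from gensim.models import Word2Vec

# Invented toy corpus: lists of pre-tokenized sentences.
sentences = [
    ["word", "embeddings", "map", "words", "to", "vectors"],
    ["word2vec", "learns", "embeddings", "from", "context", "windows"],
    ["glove", "learns", "embeddings", "from", "co-occurrence", "counts"],
]

# Train a skip-gram model (sg=1). vector_size sets the embedding
# dimensionality, window the context size; min_count=1 keeps every
# word in this tiny corpus.
model = Word2Vec(
    sentences, vector_size=50, window=2, min_count=1, sg=1, epochs=100
)

vec = model.wv["embeddings"]  # the learned 50-dimensional vector
print(model.wv.most_similar("embeddings", topn=3))  # nearest neighbors
```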

(Image credit: Dynamic Word Embedding for Evolving Semantic Discovery)

Papers

Showing 661–670 of 4,002 papers

Title | Status | Hype
A Comparative Study on Word Embeddings and Social NLP Tasks | Code | 0
Unsupervised Mitigating Gender Bias by Character Components: A Case Study of Chinese Word Embedding | | 0
Uppsala University at SemEval-2022 Task 1: Can Foreign Entries Enhance an English Reverse Dictionary? | | 0
Leveraging Three Types of Embeddings from Masked Language Models in Idiom Token Classification | | 0
Analysis of Gender Bias in Social Perception and Judgement Using Chinese Word Embeddings | | 0
Indigenous Language Revitalization and the Dilemma of Gender Bias | | 0
TurkishDelightNLP: A Neural Turkish NLP Toolkit | Code | 0
Semeval-2022 Task 1: CODWOE – Comparing Dictionaries and Word Embeddings | | 0
Language Models for Code-switch Detection of te reo Māori and English in a Low-resource Setting | | 0
Edinburgh at SemEval-2022 Task 1: Jointly Fishing for Word Embeddings and Definitions | Code | 0

No leaderboard results yet.