Word Embeddings

Word embedding is the collective name for a set of language modeling and feature learning techniques in natural language processing (NLP) where words or phrases from the vocabulary are mapped to vectors of real numbers.
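
To make the idea concrete, here is a minimal sketch in plain NumPy: a toy embedding table maps a tiny vocabulary to real-valued vectors, and cosine similarity compares words in that space. The words, dimensions, and vector values are hand-picked for illustration, not learned from data.

```python
import numpy as np

# Toy embedding table: each word maps to a 4-dimensional real-valued vector.
# Values are hand-picked for illustration, not learned from a corpus.
embeddings = {
    "king":  np.array([0.80, 0.65, 0.10, 0.05]),
    "queen": np.array([0.75, 0.70, 0.12, 0.04]),
    "apple": np.array([0.05, 0.10, 0.90, 0.70]),
}

def cosine_similarity(u, v):
    """Cosine of the angle between two vectors; 1.0 means identical direction."""
    return float(np.dot(u, v) / (np.linalg.norm(u) * np.linalg.norm(v)))

# Semantically related words should sit closer together in the vector space.
print(cosine_similarity(embeddings["king"], embeddings["queen"]))  # high
print(cosine_similarity(embeddings["king"], embeddings["apple"]))  # low
```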

Techniques for learning word embeddings include Word2Vec, GloVe, and other neural-network-based approaches that train on an NLP task such as language modeling or document classification.
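
For example, a Word2Vec model can be trained in a few lines with the gensim library. The sketch below assumes gensim 4.x; the corpus and all hyperparameter values are illustrative placeholders, not recommended settings.

```python
from gensim.models import Word2Vec

# Tiny illustrative corpus: a list of tokenized sentences.
sentences = [
    ["the", "king", "rules", "the", "kingdom"],
    ["the", "queen", "rules", "the", "kingdom"],
    ["i", "ate", "an", "apple"],
]

# Train skip-gram Word2Vec embeddings; hyperparameters are placeholders.
model = Word2Vec(
    sentences,
    vector_size=50,  # dimensionality of the learned word vectors
    window=2,        # context window size around each target word
    min_count=1,     # keep every word, even ones seen only once
    sg=1,            # 1 = skip-gram, 0 = CBOW
    epochs=100,      # many passes, since the toy corpus is tiny
)

print(model.wv["king"])               # the learned 50-dim vector for "king"
print(model.wv.most_similar("king"))  # nearest neighbors in embedding space
```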

(Image credit: Dynamic Word Embedding for Evolving Semantic Discovery)

Papers

Showing 671–680 of 4002 papers

Subword-based Cross-lingual Transfer of Embeddings from Hindi to Marathi and Nepali
BL.Research at SemEval-2022 Task 1: Deep networks for Reverse Dictionary using embeddings and LSTM autoencoders [Code]
Leveraging Three Types of Embeddings from Masked Language Models in Idiom Token Classification
Unsupervised Mitigating Gender Bias by Character Components: A Case Study of Chinese Word Embedding
Raccoons at SemEval-2022 Task 11: Leveraging Concatenated Word Embeddings for Named Entity Recognition
Language Models for Code-switch Detection of te reo Māori and English in a Low-resource Setting
Clinical Flair: A Pre-Trained Language Model for Spanish Clinical Natural Language Processing [Code]
Uppsala University at SemEval-2022 Task 1: Can Foreign Entries Enhance an English Reverse Dictionary?
TLDR at SemEval-2022 Task 1: Using Transformers to Learn Dictionaries and Representations
An Empirical Study on the Fairness of Pre-trained Word Embeddings

No leaderboard results yet.