
Word Embeddings

Word embedding is the collective name for a set of language modeling and feature learning techniques in natural language processing (NLP) where words or phrases from the vocabulary are mapped to vectors of real numbers.
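
As a minimal sketch of the idea (the vectors below are made-up toy values, not trained embeddings), each word maps to a fixed-length vector of real numbers, and word similarity can be measured as similarity between those vectors:

```python
# Toy sketch: a word embedding is a lookup table from words to
# real-valued vectors. The vectors here are illustrative, not trained.
import numpy as np

embeddings = {
    "king":  np.array([0.50, 0.68, -0.12]),
    "queen": np.array([0.52, 0.71, -0.05]),
    "apple": np.array([-0.30, 0.10, 0.90]),
}

def cosine_similarity(u: np.ndarray, v: np.ndarray) -> float:
    """Cosine of the angle between two embedding vectors."""
    return float(u @ v / (np.linalg.norm(u) * np.linalg.norm(v)))

# Semantically related words should sit closer together in the space.
print(cosine_similarity(embeddings["king"], embeddings["queen"]))  # high
print(cosine_similarity(embeddings["king"], embeddings["apple"]))  # low
```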

Techniques for learning word embeddings include Word2Vec, GloVe, and other neural-network-based approaches that train on an NLP task such as language modeling or document classification.
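
As one concrete example of such training, here is a minimal Word2Vec sketch using gensim (assuming gensim >= 4.0, where the dimensionality argument is named vector_size; the toy corpus and hyperparameters are illustrative, not from this page):

```python
# Minimal Word2Vec training sketch with gensim (assumes gensim >= 4.0).
from gensim.models import Word2Vec

# A tiny tokenized corpus; a real run would use a much larger one.
sentences = [
    ["word", "embeddings", "map", "words", "to", "vectors"],
    ["word2vec", "learns", "embeddings", "from", "context", "windows"],
    ["glove", "learns", "embeddings", "from", "cooccurrence", "counts"],
]

model = Word2Vec(
    sentences,
    vector_size=50,  # embedding dimensionality
    window=3,        # context window size
    min_count=1,     # keep every word in this toy corpus
    sg=1,            # 1 = skip-gram, 0 = CBOW
    epochs=50,       # tiny corpus, so take many passes
)

vector = model.wv["embeddings"]                        # learned 50-dim vector
neighbors = model.wv.most_similar("embeddings", topn=3)
print(vector.shape, neighbors)
```

GloVe differs in that it factorizes a global word co-occurrence matrix rather than predicting context words, but both families yield the same artifact: a real-valued vector per vocabulary word.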

(Image credit: Dynamic Word Embedding for Evolving Semantic Discovery)

Papers

Showing 2551–2560 of 4002 papers

Title | Hype
Investigating Effective Parameters for Fine-tuning of Word Embeddings Using Only a Small Corpus | 0
Investigating Gender Bias in BERT | 0
Investigating Graph Structure Information for Entity Alignment with Dangling Cases | 0
Investigating Language Universal and Specific Properties in Word Embeddings | 0
Investigating neural architectures for short answer scoring | 0
Investigating Sub-Word Embedding Strategies for the Morphologically Rich and Free Phrase-Order Hungarian | 0
Investigating the Effectiveness of Representations Based on Pretrained Transformer-based Language Models in Active Learning for Labelling Text Datasets | 0
Investigating the Stability of Concrete Nouns in Word Embeddings | 0
IRISA at SMM4H 2018: Neural Network and Bagging for Tweet Classification | 0
Is Neural Topic Modelling Better than Clustering? An Empirical Study on Clustering with Contextual Embeddings for Topics | 0
Page 256 of 401

No leaderboard results yet.