
Word Embeddings

Word embedding is the collective name for a set of language modeling and feature learning techniques in natural language processing (NLP) where words or phrases from the vocabulary are mapped to vectors of real numbers.

Techniques for learning word embeddings include Word2Vec, GloVe, and other neural-network-based approaches that train on an NLP task such as language modeling or document classification.
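The idea is easiest to see concretely. Below is a minimal sketch of learning embeddings with Word2Vec using the gensim library; the library choice, the toy corpus, and all hyperparameter values are illustrative assumptions rather than anything this page prescribes.

```python
# Minimal Word2Vec sketch (assumes gensim 4.x is installed;
# corpus and hyperparameters are toy values for illustration).
from gensim.models import Word2Vec

# Toy tokenized corpus; in practice this would be a large text collection.
sentences = [
    ["the", "king", "rules", "the", "kingdom"],
    ["the", "queen", "rules", "the", "kingdom"],
    ["embeddings", "map", "words", "to", "vectors"],
]

model = Word2Vec(
    sentences,
    vector_size=50,  # dimensionality of the embedding vectors
    window=2,        # context window size
    min_count=1,     # keep every word in this tiny corpus
    sg=1,            # 1 = skip-gram, 0 = CBOW
    epochs=100,
)

# Each vocabulary word is now mapped to a real-valued vector.
vec = model.wv["king"]  # a numpy array of shape (50,)
print(vec[:5])

# Geometric proximity in the vector space reflects distributional similarity.
print(model.wv.most_similar("king", topn=3))
```

Skip-gram (sg=1) predicts context words from a target word and tends to work better for rare words; CBOW predicts the target from its context and trains faster on large corpora.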

(Image credit: Dynamic Word Embedding for Evolving Semantic Discovery)

Papers

Showing 1971–1980 of 4,002 papers

Title | Status | Hype
En-Ar Bilingual Word Embeddings without Word Alignment: Factors Effects | - | 0
Semantic Change in the Language of UK Parliamentary Debates | - | 0
RNN Embeddings for Identifying Difficult to Understand Medical Words | Code | 0
Derivational Morphological Relations in Word Embeddings | - | 0
Unsupervised Compositional Translation of Multiword Expressions | - | 0
Investigating Sub-Word Embedding Strategies for the Morphologically Rich and Free Phrase-Order Hungarian | - | 0
Improving Word Embeddings Using Kernel PCA | - | 0
Learning Word Embeddings without Context Vectors | - | 0
The Role of Protected Class Word Lists in Bias Identification of Contextualized Word Representations | - | 0
What does Neural Bring? Analysing Improvements in Morphosyntactic Annotation and Lemmatisation of Slovenian, Croatian and Serbian | - | 0

Leaderboard

No leaderboard results yet.