
Word Embeddings

Word embedding is the collective name for a set of language modeling and feature learning techniques in natural language processing (NLP) where words or phrases from the vocabulary are mapped to vectors of real numbers.
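As a concrete illustration, a word embedding is just a lookup table from each vocabulary word to a dense real-valued vector, and word similarity is measured by comparing those vectors. The Python sketch below uses random placeholder vectors; the vocabulary, the 8-dimensional size, and the cosine-similarity helper are illustrative choices, whereas trained embeddings would place related words close together in the vector space.

```python
import numpy as np

# Toy embedding table: each vocabulary word maps to a dense real-valued vector.
# The vectors here are random placeholders; trained embeddings would encode
# semantic similarity in the geometry of the space.
vocab = ["king", "queen", "apple", "banana"]
rng = np.random.default_rng(seed=0)
embeddings = {word: rng.normal(size=8) for word in vocab}  # 8-dim vectors

def cosine_similarity(u, v):
    """Standard cosine similarity between two vectors."""
    return float(np.dot(u, v) / (np.linalg.norm(u) * np.linalg.norm(v)))

print(embeddings["king"])                                          # vector for "king"
print(cosine_similarity(embeddings["king"], embeddings["queen"]))  # arbitrary for random vectors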

Techniques for learning word embeddings include Word2Vec, GloVe, and other neural-network-based approaches that train on an NLP task such as language modeling or document classification.
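For a hedged example of one such technique, the sketch below trains a skip-gram Word2Vec model with the gensim library on a toy tokenized corpus. The corpus, the hyperparameters (vector_size, window, min_count, epochs), and the queried word are placeholder choices for illustration; meaningful embeddings require training on a much larger corpus.

```python
from gensim.models import Word2Vec  # pip install gensim

# Tiny tokenized corpus purely for illustration; real training needs far more text.
sentences = [
    ["word", "embeddings", "map", "words", "to", "vectors"],
    ["word2vec", "learns", "embeddings", "from", "context", "windows"],
    ["similar", "words", "end", "up", "with", "similar", "vectors"],
]

# sg=1 selects the skip-gram objective; sg=0 would use CBOW instead.
model = Word2Vec(sentences, vector_size=50, window=3, min_count=1, sg=1, epochs=100)

vector = model.wv["embeddings"]              # the learned 50-dimensional vector
print(model.wv.most_similar("embeddings"))   # nearest neighbours by cosine similarity
```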

(Image credit: Dynamic Word Embedding for Evolving Semantic Discovery)

Papers

Showing 1171-1180 of 4002 papers

Title | Status | Hype
Towards Multi-Sense Cross-Lingual Alignment of Contextual Embeddings | Code | 0
Unsupervised Transfer Learning in Multilingual Neural Machine Translation with Cross-Lingual Word Embeddings | | 0
Evaluation of Morphological Embeddings for the Russian Language | | 0
CNN-based Spoken Term Detection and Localization without Dynamic Programming | | 0
A Comparison of Word2Vec, HMM2Vec, and PCA2Vec for Malware Classification | | 0
Semantic-aware Knowledge Distillation for Few-Shot Class-Incremental Learning | | 0
Overcoming Poor Word Embeddings with Word Definitions | | 0
Lex2vec: making Explainable Word Embeddings via Lexical Resources | | 0
CG-CNN: Self-Supervised Feature Extraction Through Contextual Guidance and Transfer Learning | | 0
Spanish Biomedical and Clinical Language Embeddings | | 0
Page 118 of 401

No leaderboard results yet.