
Word Embeddings

Word embedding is the collective name for a set of language modeling and feature learning techniques in natural language processing (NLP) where words or phrases from the vocabulary are mapped to vectors of real numbers.
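As an illustration of that mapping, here is a minimal sketch with made-up 3-dimensional vectors (real embeddings typically have tens to hundreds of dimensions); the words and values are purely hypothetical:

```python
import numpy as np

# Toy illustration: each vocabulary word maps to a dense vector of real numbers.
# These 3-dimensional vectors are invented for the example, not learned.
embeddings = {
    "king":  np.array([0.8, 0.3, 0.1]),
    "queen": np.array([0.7, 0.4, 0.1]),
    "apple": np.array([0.1, 0.1, 0.9]),
}

def cosine_similarity(a, b):
    """Cosine of the angle between two vectors; higher means more similar."""
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))

# Semantically related words end up with higher similarity scores.
print(cosine_similarity(embeddings["king"], embeddings["queen"]))  # high
print(cosine_similarity(embeddings["king"], embeddings["apple"]))  # low
```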

Techniques for learning word embeddings include Word2Vec, GloVe, and other neural approaches that train on an NLP task such as language modeling or document classification, as in the sketch below.
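A minimal training sketch using the gensim library (an assumption; the original names no implementation), with a toy tokenized corpus invented for illustration and the gensim >= 4.0 API:

```python
from gensim.models import Word2Vec

# Toy corpus of pre-tokenized sentences; a real corpus would be far larger.
corpus = [
    ["word", "embeddings", "map", "words", "to", "vectors"],
    ["word2vec", "learns", "embeddings", "from", "context"],
    ["glove", "learns", "embeddings", "from", "cooccurrence", "statistics"],
]

model = Word2Vec(
    sentences=corpus,
    vector_size=50,   # dimensionality of the learned vectors
    window=2,         # context window size
    min_count=1,      # keep every word in this tiny corpus
    sg=1,             # 1 = skip-gram; 0 = CBOW
    epochs=50,
)

vec = model.wv["embeddings"]                        # learned 50-dim vector
print(model.wv.most_similar("embeddings", topn=3))  # nearest neighbors
```

On this toy corpus the neighbors are not meaningful; the point is only the workflow of training on a task and then reading vectors out of the model.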

(Image credit: Dynamic Word Embedding for Evolving Semantic Discovery)

Papers

Showing 661–670 of 4002 papers

Title | Status | Hype
End-to-End Text Classification via Image-based Embedding using Character-level Networks | Code | 0
Enhanced word embeddings using multi-semantic representation through lexical chains | Code | 0
Debiasing Convolutional Neural Networks via Meta Orthogonalization | Code | 0
Enriching Word Embeddings with Temporal and Spatial Information | Code | 0
Can language models learn analogical reasoning? Investigating training objectives and comparisons to human performance | Code | 0
Can We Use Word Embeddings for Enhancing Guarani-Spanish Machine Translation? | Code | 0
Equalizing Gender Biases in Neural Machine Translation with Word Embeddings Techniques | Code | 0
ESTEEM: A Novel Framework for Qualitatively Evaluating and Visualizing Spatiotemporal Embeddings in Social Media | Code | 0
ETNLP: a visual-aided systematic approach to select pre-trained embeddings for a downstream task | Code | 0
Data-driven models and computational tools for neurolinguistics: a language technology perspective | Code | 0

No leaderboard results yet.