Word Embeddings

Word embedding is the collective name for a set of language modeling and feature learning techniques in natural language processing (NLP) where words or phrases from the vocabulary are mapped to vectors of real numbers.

Techniques for learning word embeddings include Word2Vec, GloVe, and neural network-based approaches that train on an NLP task such as language modeling or document classification.
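The word-to-vector mapping described above can be made concrete in a few lines of code. Below is a minimal sketch of training Word2Vec embeddings with the gensim library; the toy corpus, vector size, and hyperparameters are illustrative assumptions, not taken from any paper listed here.

```python
# Minimal sketch: learning word embeddings with Word2Vec via gensim.
# The corpus and hyperparameters below are illustrative assumptions.
from gensim.models import Word2Vec

# Toy corpus: each document is a pre-tokenized list of words.
corpus = [
    ["word", "embeddings", "map", "words", "to", "vectors"],
    ["word2vec", "learns", "embeddings", "from", "context", "windows"],
    ["glove", "learns", "embeddings", "from", "cooccurrence", "counts"],
]

# Train a small skip-gram model (sg=1); vector_size is the embedding dimension.
model = Word2Vec(corpus, vector_size=50, window=2, min_count=1, sg=1, epochs=50)

# Each vocabulary word is now mapped to a real-valued vector.
vec = model.wv["embeddings"]   # numpy array of shape (50,)
print(vec.shape)

# Trained embeddings support similarity queries via cosine distance.
print(model.wv.most_similar("embeddings", topn=3))
```

In practice, embeddings like these are trained on much larger corpora and then reused as input features for downstream NLP tasks.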

(Image credit: Dynamic Word Embedding for Evolving Semantic Discovery)

Papers

Showing 1811–1820 of 4002 papers

Title | Status | Hype
Meemi: A Simple Method for Post-processing and Integrating Cross-lingual Word Embeddings | — | 0
A Probabilistic Framework for Learning Domain Specific Hierarchical Word Embeddings | — | 0
Joint Learning of Word and Label Embeddings for Sequence Labelling in Spoken Language Understanding | — | 0
BERTRAM: Improved Word Embeddings Have Big Impact on Contextualized Model Performance | Code | 0
Mapping Supervised Bilingual Word Embeddings from English to low-resource languages | Code | 0
Feature-Dependent Confusion Matrices for Low-Resource NER Labeling with Noisy Labels | Code | 0
Transformers without Tears: Improving the Normalization of Self-Attention | Code | 0
From the Paft to the Fiiture: a Fully Automatic NMT and Word Embeddings Method for OCR Post-Correction | Code | 0
IdBench: Evaluating Semantic Representations of Identifier Names in Source Code | Code | 0
Structured Pruning of Large Language Models | Code | 1

No leaderboard results yet.