
Word Embeddings

Word embedding is the collective name for a set of language modeling and feature learning techniques in natural language processing (NLP) where words or phrases from the vocabulary are mapped to vectors of real numbers.

Techniques for learning word embeddings include Word2Vec, GloVe, and other neural approaches that train on an auxiliary NLP task such as language modeling or document classification.
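As a minimal sketch of the idea, the snippet below trains skip-gram Word2Vec embeddings on a toy corpus. It assumes the gensim library (>= 4.0 API) is installed; the corpus and hyperparameters are illustrative only, not taken from any paper listed on this page.

```python
from gensim.models import Word2Vec

# Toy corpus: each sentence is a pre-tokenized list of words.
# (Illustrative data, not from the source page.)
corpus = [
    ["word", "embeddings", "map", "words", "to", "vectors"],
    ["glove", "and", "word2vec", "learn", "embeddings", "from", "text"],
    ["similar", "words", "get", "similar", "vectors"],
]

# sg=1 selects the skip-gram objective; vector_size is the embedding dimension.
model = Word2Vec(corpus, vector_size=50, window=3, min_count=1, sg=1, epochs=50)

vec = model.wv["embeddings"]   # learned 50-dimensional real-valued vector
print(vec.shape)               # (50,)

# Nearest neighbors by cosine similarity in the embedding space.
print(model.wv.most_similar("words", topn=3))
```

On a real corpus, words that appear in similar contexts end up with nearby vectors, which is the property the techniques above are designed to capture.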

(Image credit: Dynamic Word Embedding for Evolving Semantic Discovery)

Papers

Showing 2761–2770 of 4002 papers

Leveraging Linguistically Enhanced Embeddings for Open Information Extraction
Leveraging Linguistic Resources for Improving Neural Text Classification
Leveraging multilingual transfer for unsupervised semantic acoustic word embeddings
Leveraging Pretrained Image-text Models for Improving Audio-Visual Learning
Leveraging Pretrained Word Embeddings for Part-of-Speech Tagging of Code Switching Data
Leveraging Semantic and Sentiment Knowledge for User-Generated Text Sentiment Classification
Leveraging Three Types of Embeddings from Masked Language Models in Idiom Token Classification
Leveraging Word Embeddings for Spoken Document Summarization
Lex2vec: making Explainable Word Embeddings via Lexical Resources
Lex-BERT: Enhancing BERT based NER with lexicons
Page 277 of 401

Leaderboard

No leaderboard results yet.