
Word Embeddings

Word embedding is the collective name for a set of language modeling and feature learning techniques in natural language processing (NLP) where words or phrases from the vocabulary are mapped to vectors of real numbers.
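To make the definition concrete, the sketch below uses made-up 4-dimensional vectors (real embeddings are learned from data and typically have 50 to 1000 dimensions). It illustrates the core idea: once words are vectors, semantic relatedness can be measured geometrically, for example with cosine similarity.

```python
import numpy as np

# Toy 4-dimensional embeddings. The values are invented for illustration;
# in practice they are learned from large text corpora.
embeddings = {
    "king":  np.array([0.8, 0.6, 0.1, 0.3]),
    "queen": np.array([0.7, 0.6, 0.2, 0.4]),
    "apple": np.array([0.1, 0.9, 0.8, 0.1]),
}

def cosine_similarity(u, v):
    """Cosine of the angle between two vectors; 1.0 means same direction."""
    return np.dot(u, v) / (np.linalg.norm(u) * np.linalg.norm(v))

# Related words should end up closer in the vector space.
print(cosine_similarity(embeddings["king"], embeddings["queen"]))  # high
print(cosine_similarity(embeddings["king"], embeddings["apple"]))  # lower
```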

Techniques for learning word embeddings include Word2Vec, GloVe, and other neural network-based approaches that train on an NLP task such as language modeling or document classification.
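As a minimal sketch of how such embeddings are trained in practice, the example below uses gensim's Word2Vec implementation on a toy corpus. The corpus and hyperparameter values are placeholder assumptions; a real application would use millions of tokenized sentences and tune the dimension and window size.

```python
from gensim.models import Word2Vec

# Toy corpus of pre-tokenized sentences (placeholder data).
sentences = [
    ["word", "embeddings", "map", "words", "to", "vectors"],
    ["word2vec", "learns", "embeddings", "from", "context"],
    ["similar", "words", "get", "similar", "vectors"],
]

# Train a small skip-gram model (sg=1). vector_size is the embedding
# dimension, window is the context size, and min_count=1 keeps rare words.
model = Word2Vec(sentences, vector_size=50, window=2, min_count=1, sg=1)

vec = model.wv["embeddings"]            # the learned 50-dimensional vector
print(model.wv.most_similar("words"))   # nearest neighbors by cosine similarity
```

The same interface applies to larger corpora; only the `sentences` iterable and the hyperparameters change.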

(Image credit: Dynamic Word Embedding for Evolving Semantic Discovery)

Papers

Showing 2611–2620 of 4002 papers

Title | Status | Hype
Using BERT Embeddings to Model Word Importance in Conversational Transcripts for Deaf and Hard of Hearing Users | | 0
Using bilingual word-embeddings for multilingual collocation extraction | | 0
Using Centroids of Word Embeddings and Word Mover's Distance for Biomedical Document Retrieval in Question Answering | | 0
Using Company Specific Headlines and Convolutional Neural Networks to Predict Stock Fluctuations | | 0
Using contextual and cross-lingual word embeddings to improve variety in template-based NLG for automated journalism | | 0
Using Convolution Neural Network with BERT for Stance Detection in Vietnamese | | 0
How Can BERT Help Lexical Semantics Tasks? | | 0
Using Embedding Masks for Word Categorization | | 0
Using Gaze Data to Predict Multiword Expressions | | 0
Using k-way Co-occurrences for Learning Word Embeddings | | 0
Page 262 of 401

No leaderboard results yet.