
Word Embeddings

Word embedding is the collective name for a set of language modeling and feature learning techniques in natural language processing (NLP) where words or phrases from the vocabulary are mapped to vectors of real numbers.
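
To make the definition concrete, here is a minimal sketch of the core idea: each word is stored as a real-valued vector, and relatedness between words is measured geometrically, typically with cosine similarity. The three-dimensional vectors below are made-up toy values, not learned embeddings.

```python
import numpy as np

# Hypothetical 3-dimensional embeddings (toy values, not learned).
embeddings = {
    "king":  np.array([0.80, 0.65, 0.10]),
    "queen": np.array([0.75, 0.70, 0.15]),
    "apple": np.array([0.10, 0.20, 0.90]),
}

def cosine(u: np.ndarray, v: np.ndarray) -> float:
    """Cosine similarity: the angle-based measure commonly used on embeddings."""
    return float(u @ v / (np.linalg.norm(u) * np.linalg.norm(v)))

print(cosine(embeddings["king"], embeddings["queen"]))  # high: related words
print(cosine(embeddings["king"], embeddings["apple"]))  # low: unrelated words
```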

Techniques for learning word embeddings include Word2Vec, GloVe, and other neural network-based approaches that train on an NLP task such as language modeling or document classification.
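
As a sketch of one such technique, the example below trains skip-gram Word2Vec embeddings with the gensim library on a toy corpus. The corpus and hyperparameter values are illustrative assumptions rather than recommended settings; real training uses far larger corpora.

```python
from gensim.models import Word2Vec

# Toy corpus: each document is a pre-tokenized list of words.
sentences = [
    ["word", "embeddings", "map", "words", "to", "vectors"],
    ["similar", "words", "receive", "similar", "vectors"],
    ["word2vec", "learns", "embeddings", "from", "local", "context"],
]

model = Word2Vec(
    sentences,
    vector_size=50,  # dimensionality of the learned vectors
    window=2,        # context window on each side of the target word
    min_count=1,     # keep every token in this tiny corpus
    sg=1,            # 1 = skip-gram, 0 = CBOW
)

vec = model.wv["embeddings"]                   # a 50-dimensional real-valued vector
print(model.wv.most_similar("words", topn=3))  # nearest neighbors by cosine similarity
```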

(Image credit: Dynamic Word Embedding for Evolving Semantic Discovery)

Papers

Showing papers 2861-2870 of 4002:

A Classification-Based Approach to Cognate Detection Combining Orthographic and Semantic Similarity Information
A Closer Look on Unsupervised Cross-lingual Word Embeddings Mapping
A comparative analysis of embedding models for patent similarity
A Comparative Study of Embedding Models in Predicting the Compositionality of Multiword Expressions
A Comparative Study of Neural Network Models for Sentence Classification
A Comparative Study of Transformers on Word Sense Disambiguation
A comparative study of word embeddings and other features for lexical complexity detection in French
A Comparative Study of Word Embeddings for Reading Comprehension
A Comparison of Architectures and Pretraining Methods for Contextualized Multilingual Word Embeddings
A Comparison of Context-sensitive Models for Lexical Substitution

No leaderboard results yet.