
Word Embeddings

Word embedding is the collective name for a set of language modeling and feature learning techniques in natural language processing (NLP) where words or phrases from the vocabulary are mapped to vectors of real numbers.
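
To make the definition concrete, the sketch below maps a few words to toy 4-dimensional vectors and compares them with cosine similarity, the usual measure of closeness in embedding space. The vectors here are hypothetical placeholders; a trained model would learn hundreds of dimensions from data.

```python
import numpy as np

# Toy 4-dimensional embedding table (illustrative values, not from a real model).
embeddings = {
    "king":  np.array([0.8, 0.3, 0.1, 0.9]),
    "queen": np.array([0.7, 0.4, 0.2, 0.9]),
    "apple": np.array([0.1, 0.9, 0.8, 0.2]),
}

def cosine_similarity(u: np.ndarray, v: np.ndarray) -> float:
    """Cosine of the angle between two embedding vectors."""
    return float(np.dot(u, v) / (np.linalg.norm(u) * np.linalg.norm(v)))

# Semantically related words should score higher than unrelated ones.
print(cosine_similarity(embeddings["king"], embeddings["queen"]))  # high
print(cosine_similarity(embeddings["king"], embeddings["apple"]))  # low
```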

Techniques for learning word embeddings include Word2Vec, GloVe, and other neural-network-based approaches that train on an NLP task such as language modeling or document classification.
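
As an example of one such technique, the sketch below trains a small skip-gram Word2Vec model with the gensim library. The corpus is a placeholder of a few tokenized sentences; real training corpora run to millions of tokens.

```python
from gensim.models import Word2Vec

# Tiny tokenized corpus (placeholder sentences, purely illustrative).
corpus = [
    ["word", "embeddings", "map", "words", "to", "vectors"],
    ["word2vec", "learns", "embeddings", "from", "context"],
    ["glove", "builds", "embeddings", "from", "co-occurrence", "statistics"],
]

# sg=1 selects the skip-gram objective; vector_size sets the embedding dimension.
# min_count=1 keeps every word, which is only sensible for a toy corpus.
model = Word2Vec(sentences=corpus, vector_size=50, window=3,
                 min_count=1, sg=1, epochs=100)

# Look up a learned vector and query nearest neighbours in embedding space.
vector = model.wv["embeddings"]                      # a 50-dimensional numpy array
print(model.wv.most_similar("embeddings", topn=3))   # closest words by cosine similarity
```

The skip-gram objective predicts surrounding context words from each centre word; GloVe, by contrast, fits word vectors to global co-occurrence statistics rather than training a predictive network.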

(Image credit: Dynamic Word Embedding for Evolving Semantic Discovery)

Papers

Showing 1741–1750 of 4002 papers

Title | Status | Hype
Robust Cross-lingual Embeddings from Parallel Sentences | Code | 0
Encoding word order in complex embeddings | Code | 0
Job Prediction: From Deep Neural Network Models to Applications | - | 0
One-Shot Weakly Supervised Video Object Segmentation | - | 0
Analyzing Structures in the Semantic Vector Space: A Framework for Decomposing Word Embeddings | Code | 0
The performance evaluation of Multi-representation in the Deep Learning models for Relation Extraction Task | - | 0
Predicting the Outcome of Judicial Decisions made by the European Court of Human Rights | - | 0
A Comparison of Architectures and Pretraining Methods for Contextualized Multilingual Word Embeddings | - | 0
Artificial mental phenomena: Psychophysics as a framework to detect perception biases in AI models | - | 0
Integrating Lexical Knowledge in Word Embeddings using Sprinkling and Retrofitting | - | 0
Page 175 of 401

No leaderboard results yet.