
Word Embeddings

Word embedding is the collective name for a set of language modeling and feature learning techniques in natural language processing (NLP) where words or phrases from the vocabulary are mapped to vectors of real numbers.

Techniques for learning word embeddings include Word2Vec, GloVe, and other neural network-based approaches that are trained on an NLP task such as language modeling or document classification.
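As a concrete illustration of the mapping described above, the sketch below maps a few words to real-valued vectors and compares them with cosine similarity. The vectors are made up for illustration, not the output of any trained model:

```python
import math

# Toy embedding table: each word is mapped to a small real-valued vector.
# These values are illustrative only, not from Word2Vec, GloVe, or any
# trained model.
embeddings = {
    "king":  [0.8, 0.6, 0.1],
    "queen": [0.7, 0.7, 0.1],
    "apple": [0.1, 0.2, 0.9],
}

def cosine(u, v):
    """Cosine similarity between two vectors: dot(u, v) / (|u| * |v|)."""
    dot = sum(a * b for a, b in zip(u, v))
    norm_u = math.sqrt(sum(a * a for a in u))
    norm_v = math.sqrt(sum(b * b for b in v))
    return dot / (norm_u * norm_v)

# Semantically related words should end up closer in the vector space.
print(cosine(embeddings["king"], embeddings["queen"]))
print(cosine(embeddings["king"], embeddings["apple"]))
```

In a real system these vectors would come from a trained model (e.g. Word2Vec or GloVe), but the lookup-then-compare pattern is the same.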

(Image credit: Dynamic Word Embedding for Evolving Semantic Discovery)

Papers

Showing 1981–1990 of 4002 papers

- Japanese Lexical Simplification for Non-Native Speakers
- Japanese Word Readability Assessment using Word Embeddings
- JCT at SemEval-2021 Task 1: Context-aware Representation for Lexical Complexity Prediction
- JeSemE: Interleaving Semantics and Emotions in a Web Service for the Exploration of Language Change Phenomena
- JeuxDeLiens: Word Embeddings and Path-Based Similarity for Entity Linking using the French JeuxDeMots Lexical Semantic Network
- JHU System Description for the MADAR Arabic Dialect Identification Shared Task
- Job Prediction: From Deep Neural Network Models to Applications
- Joint Embeddings of Chinese Words, Characters, and Fine-grained Subcharacter Components
- BERT-based Ensembles for Modeling Disclosure and Support in Conversational Social Media Text
- Dependency-Based Word Embeddings
Page 199 of 401

No leaderboard results yet.