
Word Embeddings

Word embedding is the collective name for a set of language modeling and feature learning techniques in natural language processing (NLP) where words or phrases from the vocabulary are mapped to vectors of real numbers.
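
To make the mapping concrete, the sketch below builds a tiny hand-written lookup table from words to vectors and compares words with cosine similarity. The words, the 4-dimensional vectors, and their values are invented purely for illustration; real embeddings are learned from data and typically have tens to hundreds of dimensions.

```python
import numpy as np

# Toy illustration: an embedding is a lookup table from words to
# real-valued vectors. These 4-dimensional vectors are made up for
# the example; real embeddings are learned, not hand-written.
embeddings = {
    "king":  np.array([0.8, 0.1, 0.7, 0.3]),
    "queen": np.array([0.8, 0.2, 0.7, 0.9]),
    "apple": np.array([0.1, 0.9, 0.2, 0.4]),
}

def cosine_similarity(u, v):
    """Cosine of the angle between two vectors; values near 1.0 mean similar."""
    return np.dot(u, v) / (np.linalg.norm(u) * np.linalg.norm(v))

# Semantically related words should end up closer in the vector space.
print(cosine_similarity(embeddings["king"], embeddings["queen"]))  # ~0.91, high
print(cosine_similarity(embeddings["king"], embeddings["apple"]))  # ~0.38, lower
```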

Techniques for learning word embeddings include Word2Vec, GloVe, and other neural-network-based approaches that are trained on an NLP task such as language modeling or document classification.
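
As a concrete example of the Word2Vec route, here is a minimal training sketch assuming the gensim library (version 4.x, where the dimensionality parameter is named vector_size). The toy corpus and all hyperparameter values are illustrative only, not a recommended configuration.

```python
from gensim.models import Word2Vec

# Tiny toy corpus: each sentence is a list of tokens.
# A realistic corpus would contain millions of sentences.
corpus = [
    ["the", "king", "rules", "the", "kingdom"],
    ["the", "queen", "rules", "the", "kingdom"],
    ["we", "ate", "an", "apple", "and", "an", "orange"],
]

model = Word2Vec(
    sentences=corpus,
    vector_size=50,   # dimensionality of the learned vectors
    window=2,         # context window on each side of the target word
    min_count=1,      # keep every word (the toy corpus is tiny)
    sg=1,             # 1 = skip-gram; 0 = CBOW
    seed=42,
)

vector = model.wv["king"]             # the 50-dimensional embedding for "king"
print(model.wv.most_similar("king"))  # nearest neighbors in the vector space
```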

(Image credit: Dynamic Word Embedding for Evolving Semantic Discovery)

Papers

Showing 2811-2820 of 4002 papers

| Title | Hype |
| --- | --- |
| Wordsurf: a tool to surf in a "word embeddings" space | 0 |
| Word vectors, reuse, and replicability: Towards a community repository of large-text resources | 0 |
| WOVe: Incorporating Word Order in GloVe Word Embeddings | 0 |
| XLNET-GRU Sentiment Regression Model for Cryptocurrency News in English and Malay | 0 |
| XMU Neural Machine Translation Systems for WAT 2017 | 0 |
| Contextualized End-to-End Neural Entity Linking | 0 |
| YNUDLG at SemEval-2017 Task 4: A GRU-SVM Model for Sentiment Classification and Quantification in Twitter | 0 |
| YNU-HPCC at IJCNLP-2017 Task 5: Multi-choice Question Answering in Exams Using an Attention-based LSTM Model | 0 |
| You shall know a piece by the company it keeps. Chess plays as a data for word2vec models | 0 |
| Yseop at FinSim-3 Shared Task 2021: Specializing Financial Domain Learning with Phrase Representations | 0 |
Page 282 of 401

Leaderboard

No leaderboard results yet.