Word Embeddings

Word embedding is the collective name for a set of language modeling and feature learning techniques in natural language processing (NLP) where words or phrases from the vocabulary are mapped to vectors of real numbers.

Techniques for learning word embeddings include Word2Vec, GloVe, and other neural network-based approaches that train on an NLP task such as language modeling or document classification.

(Image credit: Dynamic Word Embedding for Evolving Semantic Discovery)
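As a concrete illustration of the word-to-vector mapping described above, the sketch below trains skip-gram embeddings with gensim's Word2Vec on a tiny toy corpus. The corpus and all hyperparameter values are illustrative assumptions (gensim >= 4.0 is assumed for the `vector_size` argument); none of this is drawn from the papers listed below.

```python
# Minimal sketch: learning word embeddings with gensim's Word2Vec.
# The toy corpus and hyperparameters are illustrative assumptions only.
from gensim.models import Word2Vec

# Toy corpus: each document is a list of tokens.
sentences = [
    ["word", "embeddings", "map", "words", "to", "vectors"],
    ["word2vec", "learns", "embeddings", "from", "context", "windows"],
    ["glove", "learns", "embeddings", "from", "cooccurrence", "counts"],
]

# sg=1 selects the skip-gram objective; sg=0 would use CBOW.
model = Word2Vec(
    sentences,
    vector_size=50,  # dimensionality of the learned vectors
    window=2,        # context window size
    min_count=1,     # keep every token in this tiny corpus
    sg=1,
    epochs=100,
    seed=42,
)

# Each word is now a dense real-valued vector...
vec = model.wv["embeddings"]  # numpy array of shape (50,)

# ...and geometric proximity in that space reflects distributional similarity.
print(model.wv.most_similar("embeddings", topn=3))
```

GloVe arrives at the same kind of vector space by a different route, fitting vectors to global co-occurrence counts rather than predicting words from local context windows.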

Papers

Showing papers 471–480 of 4,002 (page 48 of 401)

| Title | Status | Hype |
| --- | --- | --- |
| Cross-Lingual Word Representations via Spectral Graph Embeddings | Code | 0 |
| A Resource-Light Method for Cross-Lingual Semantic Textual Similarity | Code | 0 |
| An Analysis of Euclidean vs. Graph-Based Framing for Bilingual Lexicon Induction from Word Embedding Spaces | Code | 0 |
| Fine-tuning Tree-LSTM for phrase-level sentiment classification on a Polish dependency treebank. Submission to PolEval task 2 | Code | 0 |
| fMRI predictors based on language models of increasing complexity recover brain left lateralization | Code | 0 |
| Automated Generation of Multilingual Clusters for the Evaluation of Distributed Representations | Code | 0 |
| A Resource-Free Evaluation Metric for Cross-Lingual Word Embeddings Based on Graph Modularity | Code | 0 |
| Frame- and Entity-Based Knowledge for Common-Sense Argumentative Reasoning | Code | 0 |
| Are Girls Neko or Shōjo? Cross-Lingual Alignment of Non-Isomorphic Embeddings with Iterative Normalization | Code | 0 |
| Cross-Lingual Word Embeddings for Turkic Languages | Code | 0 |

No leaderboard results yet.