
Word Embeddings

Word embedding is the collective name for a set of language modeling and feature learning techniques in natural language processing (NLP) where words or phrases from the vocabulary are mapped to vectors of real numbers.
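
To make the mapping concrete, the sketch below shows an embedding as a plain lookup table in Python with NumPy. The toy vocabulary, the embed helper, and all numeric values are hypothetical illustrations, not taken from any trained model.

    import numpy as np

    # A word embedding is essentially a lookup table: each vocabulary word
    # indexes a row in a dense matrix of real numbers.
    vocab = {"king": 0, "queen": 1, "man": 2, "woman": 3}  # hypothetical toy vocabulary
    embedding_matrix = np.array([
        [0.50, 0.68, -0.59, -0.01],  # king   (illustrative values, not from a real model)
        [0.37, 0.78, -0.55,  0.31],  # queen
        [0.33, 0.10, -0.45, -0.12],  # man
        [0.26, 0.14, -0.43,  0.29],  # woman
    ])

    def embed(word: str) -> np.ndarray:
        """Map a word to its real-valued vector."""
        return embedding_matrix[vocab[word]]

    def cosine(u: np.ndarray, v: np.ndarray) -> float:
        """Relatedness becomes geometry: cosine similarity in the vector space."""
        return float(u @ v / (np.linalg.norm(u) * np.linalg.norm(v)))

    print(embed("queen"))                         # the 4-dimensional vector for "queen"
    print(cosine(embed("king"), embed("queen")))  # similar words -> high similarity

Real systems use far larger vocabularies and vectors of roughly 50 to 300 dimensions; the principle of word-to-row lookup is the same.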

Techniques for learning word embeddings include Word2Vec, GloVe, and other neural network-based approaches that train on an NLP task such as language modeling or document classification.
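
As a brief illustration of such a technique, the sketch below trains a tiny Word2Vec model with the gensim library (assuming gensim >= 4.0). The corpus and hyperparameters are arbitrary placeholders for demonstration, not a recommended configuration.

    from gensim.models import Word2Vec

    # Toy tokenized corpus (placeholder data; real training uses millions of sentences).
    sentences = [
        ["the", "cat", "sat", "on", "the", "mat"],
        ["the", "dog", "sat", "on", "the", "log"],
        ["cats", "and", "dogs", "are", "animals"],
    ]

    # Train a skip-gram Word2Vec model; all settings below are illustrative choices.
    model = Word2Vec(
        sentences,
        vector_size=50,  # dimensionality of the learned vectors
        window=3,        # context window around each target word
        min_count=1,     # keep every word (only sensible for a toy corpus)
        sg=1,            # 1 = skip-gram, 0 = CBOW
        epochs=50,
    )

    print(model.wv["cat"][:5])                   # first 5 dimensions of the vector for "cat"
    print(model.wv.most_similar("cat", topn=2))  # nearest neighbours in vector space

Skip-gram learns the vectors by predicting context words from each target word, so words that appear in similar contexts end up with similar vectors.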

(Image credit: Dynamic Word Embedding for Evolving Semantic Discovery)

Papers

Showing 461–470 of 4002 papers

Title | Status | Hype
A Resource-Light Method for Cross-Lingual Semantic Textual Similarity | Code | 0
Exploring Neural Text Simplification Models | Code | 0
Augmenting semantic lexicons using word embeddings and transfer learning | Code | 0
Analyzing the Surprising Variability in Word Embedding Stability Across Languages | Code | 0
A Resource-Free Evaluation Metric for Cross-Lingual Word Embeddings Based on Graph Modularity | Code | 0
Analyzing Vietnamese Legal Questions Using Deep Neural Networks with Biaffine Classifiers | Code | 0
Are Girls Neko or Shōjo? Cross-Lingual Alignment of Non-Isomorphic Embeddings with Iterative Normalization | Code | 0
Fast and Robust Comparison of Probability Measures in Heterogeneous Spaces | Code | 0
CS-Embed at SemEval-2020 Task 9: The effectiveness of code-switched word embeddings for sentiment analysis | Code | 0
Debiasing Sentence Embedders through Contrastive Word Pairs | Code | 0
Page 47 of 401

No leaderboard results yet.