SOTAVerified

Word Embeddings

Word embedding is the collective name for a set of language modeling and feature learning techniques in natural language processing (NLP) where words or phrases from the vocabulary are mapped to vectors of real numbers.

Techniques for learning word embeddings include Word2Vec, GloVe, and other neural-network-based approaches that train on an NLP task such as language modeling or document classification.
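The core idea can be sketched in a few lines: each word maps to a vector, and geometric operations on those vectors capture semantic relations. Below is a minimal illustration with tiny hand-set 4-dimensional vectors (these values are made up for demonstration; real embeddings such as Word2Vec or GloVe are learned from large corpora and typically have 100-300 dimensions).

```python
import numpy as np

# Toy vocabulary with hand-set 4-dimensional vectors (illustrative only;
# trained embeddings would come from a model such as Word2Vec or GloVe).
embeddings = {
    "king":  np.array([0.9, 0.8, 0.1, 0.0]),
    "queen": np.array([0.9, 0.1, 0.8, 0.0]),
    "man":   np.array([0.1, 0.9, 0.0, 0.1]),
    "woman": np.array([0.1, 0.0, 0.9, 0.1]),
}

def cosine(u, v):
    """Cosine similarity: 1.0 for identical directions, 0.0 for orthogonal."""
    return float(np.dot(u, v) / (np.linalg.norm(u) * np.linalg.norm(v)))

# The classic analogy test: king - man + woman should land nearest queen.
analogy = embeddings["king"] - embeddings["man"] + embeddings["woman"]
best = max(embeddings, key=lambda w: cosine(embeddings[w], analogy))
print(best)  # → queen
```

Because similarity is measured by vector geometry rather than string matching, words that occur in similar contexts end up close together, which is what downstream NLP tasks exploit.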

(Image credit: Dynamic Word Embedding for Evolving Semantic Discovery)

Papers

Showing 926–950 of 4002 papers

Title | Status | Hype
Detecting Anxiety through Reddit | Code | 0
A Bi-Encoder LSTM Model For Learning Unstructured Dialogs | Code | 0
Cross-lingual Argumentation Mining: Machine Translation (and a bit of Projection) is All You Need! | Code | 0
Cross-Lingual BERT Transformation for Zero-Shot Dependency Parsing | Code | 0
Analyzing the Surprising Variability in Word Embedding Stability Across Languages | Code | 0
Diachronic Word Embeddings Reveal Statistical Laws of Semantic Change | Code | 0
Deriving Disinformation Insights from Geolocalized Twitter Callouts | Code | 0
CWTM: Leveraging Contextualized Word Embeddings from BERT for Neural Topic Modeling | Code | 0
Cross-lingual Dependency Parsing with Unlabeled Auxiliary Languages | Code | 0
Deriving Word Vectors from Contextualized Language Models using Topic-Aware Mention Selection | Code | 0
A Comprehensive Comparison of Word Embeddings in Event & Entity Coreference Resolution | Code | 0
Revisiting Tri-training of Dependency Parsers | Code | 0
RNN Embeddings for Identifying Difficult to Understand Medical Words | Code | 0
Cross-lingual Lexical Sememe Prediction | Code | 0
Robust Cross-lingual Embeddings from Parallel Sentences | Code | 0
Cross-lingual Models of Word Embeddings: An Empirical Comparison | Code | 0
Robust Gram Embeddings | Code | 0
BERTRAM: Improved Word Embeddings Have Big Impact on Contextualized Model Performance | Code | 0
Russian word sense induction by clustering averaged word embeddings | Code | 0
Dependency Sensitive Convolutional Neural Networks for Modeling Sentences and Documents | Code | 0
Design and Implementation of a Quantum Kernel for Natural Language Processing | Code | 0
An Analysis of Euclidean vs. Graph-Based Framing for Bilingual Lexicon Induction from Word Embedding Spaces | Code | 0
SART - Similarity, Analogies, and Relatedness for Tatar Language: New Benchmark Datasets for Word Embeddings Evaluation | Code | 0
SChME at SemEval-2020 Task 1: A Model Ensemble for Detecting Lexical Semantic Change | Code | 0
Diagnosing BERT with Retrieval Heuristics | Code | 0
Page 38 of 161

No leaderboard results yet.