
Word Embeddings

Word embedding is the collective name for a set of language modeling and feature learning techniques in natural language processing (NLP) where words or phrases from the vocabulary are mapped to vectors of real numbers.

Techniques for learning word embeddings include Word2Vec, GloVe, and other neural network-based approaches that train embeddings on an NLP task such as language modeling or document classification.
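As an illustration, here is a minimal sketch of training such embeddings with the skip-gram variant of Word2Vec via the gensim library; it assumes gensim >= 4.0 is installed, and the toy corpus, parameter values, and queried words are illustrative only, not drawn from any paper listed below.

```python
# Minimal Word2Vec training sketch (assumed: gensim >= 4.0).
# The corpus, parameters, and queried words are illustrative only.
from gensim.models import Word2Vec

corpus = [
    ["word", "embeddings", "map", "words", "to", "vectors"],
    ["similar", "words", "receive", "similar", "vectors"],
    ["word2vec", "and", "glove", "learn", "such", "vectors"],
]

model = Word2Vec(
    sentences=corpus,
    vector_size=50,  # dimensionality of the real-valued vectors
    window=2,        # context window size
    min_count=1,     # keep every token in this tiny corpus
    sg=1,            # 1 = skip-gram; 0 = CBOW
)

vec = model.wv["vectors"]  # a 50-dimensional NumPy array
print(vec.shape)           # (50,)
print(model.wv.most_similar("vectors", topn=3))
```

Words that appear in similar contexts end up with nearby vectors, which is the property the papers listed below build on.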

(Image credit: Dynamic Word Embedding for Evolving Semantic Discovery)

Papers

Showing 3541–3550 of 4002 papers

Title | Status | Hype
SemanticZ at SemEval-2016 Task 3: Ranking Relevant Answers in Community Question Answering Using Semantic Similarity Based on Fine-tuned Word Embeddings | Code | 0
A Survey on Sentence Embedding Models Performance for Patent Analysis | Code | 0
Temporal Word Analogies: Identifying Lexical Replacement with Diachronic Word Embeddings | Code | 0
Enriching Word Embeddings with Temporal and Spatial Information | Code | 0
Enriching Word Vectors with Subword Information | Code | 0
An Open-World Extension to Knowledge Graph Completion Models | Code | 0
VCDM: Leveraging Variational Bi-encoding and Deep Contextualized Word Representations for Improved Definition Modeling | Code | 0
Learning Neural Word Salience Scores | Code | 0
On the Interpretability and Significance of Bias Metrics in Texts: a PMI-based Approach | Code | 0
Better Summarization Evaluation with Word Embeddings for ROUGE | Code | 0
Page 355 of 401

No leaderboard results yet.