Word Embeddings

Word embedding is the collective name for a set of language modeling and feature learning techniques in natural language processing (NLP) where words or phrases from the vocabulary are mapped to vectors of real numbers.
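
In code, this mapping can be pictured as a lookup table from words to real-valued vectors, where geometric closeness stands in for semantic similarity. The following is a minimal sketch with made-up words and vectors (nothing here comes from the papers listed below):

```python
import numpy as np

# Toy embedding table: each vocabulary word maps to a vector of real numbers.
# Words, dimensions, and values are invented purely for illustration.
embeddings = {
    "king":  np.array([0.80, 0.65, 0.10]),
    "queen": np.array([0.78, 0.70, 0.12]),
    "apple": np.array([0.10, 0.20, 0.90]),
}

def cosine_similarity(a: np.ndarray, b: np.ndarray) -> float:
    """Cosine of the angle between two vectors; higher means more similar."""
    return float(a @ b / (np.linalg.norm(a) * np.linalg.norm(b)))

# Related words should sit closer together in the embedding space.
print(cosine_similarity(embeddings["king"], embeddings["queen"]))  # high
print(cosine_similarity(embeddings["king"], embeddings["apple"]))  # low
```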

Techniques for learning word embeddings include Word2Vec, GloVe, and other neural-network-based approaches that train on an NLP task such as language modeling or document classification.
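
As a sketch of how one of these techniques is typically applied, the snippet below trains a skip-gram Word2Vec model using the third-party gensim library (an assumption of this example; the toy corpus and hyperparameters are illustrative and not taken from any listed paper):

```python
from gensim.models import Word2Vec

# Tiny illustrative corpus; real models train on millions of sentences.
sentences = [
    ["the", "cat", "sat", "on", "the", "mat"],
    ["the", "dog", "sat", "on", "the", "rug"],
    ["cats", "and", "dogs", "are", "pets"],
]

# Train skip-gram Word2Vec: sg=1 selects skip-gram, vector_size is the
# embedding dimensionality, window is the context size around each word.
model = Word2Vec(sentences, vector_size=50, window=2, min_count=1, sg=1, seed=42)

vector = model.wv["cat"]                         # the learned 50-d embedding
similar = model.wv.most_similar("cat", topn=2)   # nearest words by cosine
print(vector.shape, similar)
```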

(Image credit: Dynamic Word Embedding for Evolving Semantic Discovery)

Papers

Showing 1311–1320 of 4002 papers (page 132 of 401)

Title | Status | Hype
Demonstration of a Literature Based Discovery System based on Ontologies, Semantic Filters and Word Embeddings for the Raynaud Disease-Fish Oil Rediscovery | - | 0
TUE at SemEval-2020 Task 1: Detecting Semantic Change by Clustering Contextual Word Embeddings | - | 0
Go Simple and Pre-Train on Domain-Specific Corpora: On the Role of Training Data for Text Classification | - | 0
Manifold Learning-based Word Representation Refinement Incorporating Global and Local Information | - | 0
An Empirical Study of the Downstream Reliability of Pre-Trained Word Embeddings | - | 0
Blind signal decomposition of various word embeddings based on join and individual variance explained | - | 0
A Deep Content-Based Model for Persian Rumor Verification | - | 0
Automatic coding of students' writing via Contrastive Representation Learning in the Wasserstein space | - | 0
Unsupervised Word Translation Pairing using Refinement based Point Set Registration | - | 0
Improving Biomedical Named Entity Recognition with Syntactic Information | Code | 0

No leaderboard results yet.