
Word Embeddings

Word embedding is the collective name for a set of language modeling and feature learning techniques in natural language processing (NLP) where words or phrases from the vocabulary are mapped to vectors of real numbers.
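To make the definition concrete, the sketch below builds a tiny embedding table by hand and compares words with cosine similarity. The 4-dimensional vectors and the word list are made-up toy values for illustration only, not the output of any real model.

```python
import numpy as np

# Toy embedding table. These 4-dimensional vectors are made-up,
# purely illustrative values; real embeddings typically have 50-300+
# dimensions and are learned from data rather than written by hand.
embeddings = {
    "king":  np.array([0.50, 0.68, -0.59, 0.12]),
    "queen": np.array([0.54, 0.70, -0.55, 0.30]),
    "apple": np.array([-0.12, 0.05, 0.88, -0.44]),
}

def cosine_similarity(a, b):
    """Cosine of the angle between two vectors; related words score
    close to 1, unrelated words score much lower."""
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))

print(cosine_similarity(embeddings["king"], embeddings["queen"]))  # high
print(cosine_similarity(embeddings["king"], embeddings["apple"]))  # low
```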

Techniques for learning word embeddings include Word2Vec, GloVe, and other neural network-based approaches that train on an NLP task such as language modeling or document classification.
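As a minimal sketch of one such technique, the example below trains a skip-gram Word2Vec model with the gensim library. The corpus, hyperparameter values, and query words are illustrative assumptions chosen so the toy example runs, not settings drawn from any paper listed here.

```python
from gensim.models import Word2Vec

# Tiny toy corpus: each document is a list of tokens. A real run
# would use a large, pre-tokenized text collection.
corpus = [
    ["word", "embeddings", "map", "words", "to", "vectors"],
    ["word2vec", "learns", "embeddings", "from", "context", "windows"],
    ["glove", "learns", "embeddings", "from", "cooccurrence", "counts"],
]

# Skip-gram Word2Vec; vector_size, window, and min_count are the usual
# hyperparameters, set small here only because the corpus is tiny.
model = Word2Vec(
    sentences=corpus,
    vector_size=50,
    window=2,
    min_count=1,
    sg=1,        # 1 = skip-gram, 0 = CBOW
    epochs=50,
)

vec = model.wv["embeddings"]          # the learned 50-dim vector
print(model.wv.most_similar("word"))  # nearest neighbors in vector space
```

With only three toy sentences the neighbors will be noisy; the point is the API shape: train on tokenized sentences, then look up vectors and nearest neighbors through `model.wv`.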

(Image credit: Dynamic Word Embedding for Evolving Semantic Discovery)

Papers

Showing 2181–2190 of 4002 papers

- Semi-automated extraction of research topics and trends from NCI funding in radiological sciences from 2000-2020
- Semi-automatic WordNet Linking using Word Embeddings
- SEMIE: SEMantically Infused Embeddings with Enhanced Interpretability for Domain-specific Small Corpus
- Semi-Supervised Bootstrapping of Relationship Extractors with Distributional Semantics
- Semi-supervised Convolutional Neural Networks for Text Categorization via Region Embedding
- Semi-supervised Dependency Parsing using Bilexical Contextual Features from Auto-Parsed Data
- Semi-Supervised Instance Population of an Ontology using Word Vector Embeddings
- Semi-supervised Learning with Multi-Domain Sentiment Word Embeddings
- Meta-Embedding as Auxiliary Task Regularization
- SemR-11: A Multi-Lingual Gold-Standard for Semantic Similarity and Relatedness for Eleven Languages

No leaderboard results yet.