
Word Embeddings

Word embedding is the collective name for a set of language modeling and feature learning techniques in natural language processing (NLP) where words or phrases from the vocabulary are mapped to vectors of real numbers.
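Concretely, an embedding model amounts to a lookup table from vocabulary items to dense vectors, and similarity between words is measured geometrically. Below is a minimal sketch; the toy vocabulary and 4-dimensional vectors are illustrative, not taken from any trained model.

```python
import numpy as np

# Toy embedding table: each word maps to a dense vector of real numbers.
# Trained models typically use 50-1000 dimensions; 4 is used for readability.
embeddings = {
    "king":  np.array([0.80, 0.65, 0.10, 0.20]),
    "queen": np.array([0.75, 0.70, 0.12, 0.25]),
    "apple": np.array([0.05, 0.10, 0.90, 0.30]),
}

def cosine_similarity(u: np.ndarray, v: np.ndarray) -> float:
    """Cosine similarity of two embedding vectors, in [-1, 1]."""
    return float(u @ v / (np.linalg.norm(u) * np.linalg.norm(v)))

# Words with related meanings should receive nearby vectors.
print(cosine_similarity(embeddings["king"], embeddings["queen"]))  # high
print(cosine_similarity(embeddings["king"], embeddings["apple"]))  # low
```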

Techniques for learning word embeddings include Word2Vec, GloVe, and other neural network-based approaches that train on an NLP task such as language modeling or document classification.
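As a concrete example of one such technique, the sketch below trains a skip-gram Word2Vec model with the gensim library (assuming the gensim 4.x API; the tiny corpus and hyperparameters are illustrative only).

```python
from gensim.models import Word2Vec

# Word2Vec consumes tokenized sentences; a real corpus has millions of tokens.
sentences = [
    ["the", "king", "rules", "the", "kingdom"],
    ["the", "queen", "rules", "the", "kingdom"],
    ["i", "ate", "an", "apple"],
]

model = Word2Vec(
    sentences,
    vector_size=100,  # embedding dimensionality
    window=5,         # context window size
    min_count=1,      # keep every word (raise this on real corpora)
    sg=1,             # 1 = skip-gram, 0 = CBOW
    workers=4,
)

vector = model.wv["king"]              # learned 100-dimensional vector
print(model.wv.most_similar("king"))   # nearest neighbors in embedding space
```

On a corpus this small the neighbors are noise; the point is only the API shape: train on tokenized text, then read vectors out of `model.wv`.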

(Image credit: Dynamic Word Embedding for Evolving Semantic Discovery)

Papers

Showing 3181-3190 of 4002 papers

Title | Status | Hype
Decision-Directed Data Decomposition | Code | 0
Context encoders as a simple but powerful extension of word2vec | Code | 0
Resource-Size matters: Improving Neural Named Entity Recognition with Optimized Large Corpora | Code | 0
Words as Bridges: Exploring Computational Support for Cross-Disciplinary Translation Work | Code | 0
A Resource-Free Evaluation Metric for Cross-Lingual Word Embeddings Based on Graph Modularity | Code | 0
Are Girls Neko or Shōjo? Cross-Lingual Alignment of Non-Isomorphic Embeddings with Iterative Normalization | Code | 0
Multiple Word Embeddings for Increased Diversity of Representation | Code | 0
Multiplex Word Embeddings for Selectional Preference Acquisition | Code | 0
InfiniteWalk: Deep Network Embeddings as Laplacian Embeddings with a Nonlinearity | Code | 0
BioSentVec: creating sentence embeddings for biomedical texts | Code | 0
