
Word Embeddings

Word embedding is the collective name for a set of language modeling and feature learning techniques in natural language processing (NLP) where words or phrases from the vocabulary are mapped to vectors of real numbers.
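
As a minimal sketch of this mapping, the toy Python snippet below builds a hypothetical four-word vocabulary with randomly initialised vectors standing in for learned embeddings; the vocabulary, dimensionality, and the `embed` and `cosine` helpers are all illustrative and not drawn from any particular library.

```python
# Minimal sketch of a word-embedding lookup table. The vocabulary and
# 3-dimensional random vectors are toy stand-ins; real embeddings are
# learned from data and typically use hundreds of dimensions.
import numpy as np

vocab = {"king": 0, "queen": 1, "man": 2, "woman": 3}  # word -> row index
rng = np.random.default_rng(0)
embeddings = rng.normal(size=(len(vocab), 3))  # one real-valued vector per word

def embed(word: str) -> np.ndarray:
    """Map a word to its vector of real numbers."""
    return embeddings[vocab[word]]

def cosine(u: np.ndarray, v: np.ndarray) -> float:
    """Cosine similarity, the usual way to compare embedding vectors."""
    return float(u @ v / (np.linalg.norm(u) * np.linalg.norm(v)))

print(embed("king"))                            # the vector for "king"
print(cosine(embed("king"), embed("queen")))    # similarity of two words
```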

Techniques for learning word embeddings include Word2Vec, which trains a shallow neural network to predict words from their contexts, and GloVe, which fits vectors to word co-occurrence statistics; embeddings also arise as a by-product of neural networks trained on NLP tasks such as language modeling or document classification.
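
As one concrete example, the sketch below trains a skip-gram Word2Vec model with the gensim library on a tiny made-up corpus. It assumes gensim >= 4.0 (where the dimension argument is named vector_size); the corpus and hyperparameters are illustrative only.

```python
# Hedged sketch of learning embeddings with Word2Vec via gensim
# (assumes gensim >= 4.0; earlier versions name some arguments differently).
from gensim.models import Word2Vec

# Tiny illustrative corpus; real training uses millions of sentences.
sentences = [
    ["the", "king", "rules", "the", "kingdom"],
    ["the", "queen", "rules", "the", "kingdom"],
    ["the", "man", "walks", "in", "the", "city"],
    ["the", "woman", "walks", "in", "the", "city"],
]

model = Word2Vec(
    sentences,
    vector_size=50,   # dimensionality of the learned vectors
    window=2,         # context window size
    min_count=1,      # keep every word in this toy corpus
    sg=1,             # 1 = skip-gram; 0 = CBOW
    epochs=50,
)

print(model.wv["king"])               # the learned vector for "king"
print(model.wv.most_similar("king"))  # nearest neighbours in embedding space
```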

(Image credit: Dynamic Word Embedding for Evolving Semantic Discovery)

Papers

Showing 261-270 of 4002 papers

Title | Hype
Application of Clinical Concept Embeddings for Heart Failure Prediction in UK EHR data | 0
Applying Occam’s Razor to Transformer-Based Dependency Parsing: What Works, What Doesn’t, and What is Really Necessary | 0
An Efficient Cross-lingual Model for Sentence Classification Using Convolutional Neural Network | 0
Anchor-based Bilingual Word Embeddings for Low-Resource Languages | 0
Adversarial Representation Learning for Text-to-Image Matching | 0
An Automatic Learning of an Algerian Dialect Lexicon by using Multilingual Word Embeddings | 0
A Comparison of Architectures and Pretraining Methods for Contextualized Multilingual Word Embeddings | 0
Examining Structure of Word Embeddings with PCA | 0
An Automated Method to Enrich Consumer Health Vocabularies Using GloVe Word Embeddings and An Auxiliary Lexical Resource | 0
An Attentive Fine-Grained Entity Typing Model with Latent Type Representation | 0

Leaderboards

No leaderboard results yet.