Word Embeddings

Word embedding is the collective name for a set of language modeling and feature learning techniques in natural language processing (NLP) where words or phrases from the vocabulary are mapped to vectors of real numbers.

Techniques for learning word embeddings include Word2Vec, GloVe, and other neural network-based approaches that train on an NLP task such as language modeling or document classification.
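
The page does not prescribe an implementation, but as a minimal sketch of the Word2Vec approach mentioned above, the following Python snippet (assuming the gensim library and a toy corpus, both chosen here purely for illustration) trains a skip-gram model and queries the learned vectors:

```python
# Minimal Word2Vec sketch with gensim (assumed dependency; toy corpus for illustration).
from gensim.models import Word2Vec

# Each "sentence" is a list of tokens; real corpora are far larger.
corpus = [
    ["word", "embeddings", "map", "words", "to", "vectors"],
    ["word2vec", "learns", "embeddings", "from", "local", "context"],
    ["glove", "uses", "global", "cooccurrence", "statistics"],
]

# sg=1 selects the skip-gram objective; vector_size is the embedding dimension.
model = Word2Vec(sentences=corpus, vector_size=50, window=2, min_count=1, sg=1)

# Every vocabulary word now maps to a real-valued vector.
vec = model.wv["embeddings"]            # numpy array of shape (50,)
print(vec.shape)

# Nearest neighbours by cosine similarity in the embedding space.
print(model.wv.most_similar("word", topn=3))
```

In practice, pretrained vectors (for example the published GloVe releases) are often loaded instead of trained from scratch.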

(Image credit: Dynamic Word Embedding for Evolving Semantic Discovery)

Papers

Showing 1361–1370 of 4002 papers

Title | Status | Hype
Japanese Word Readability Assessment using Word Embeddings |  | 0
Discovering Bilingual Lexicons in Polyglot Word Embeddings |  | 0
Query Focused Multi-document Summarisation of Biomedical Texts: Macquarie University and the Australian National University at BioASQ8b | Code | 0
GREEK-BERT: The Greeks visiting Sesame Street | Code | 1
Multimodal Learning for Cardiovascular Risk Prediction using EHR Data |  | 0
Query Focused Multi-document Summarisation of Biomedical Texts | Code | 0
Contextualized moral inference |  | 0
Simple Unsupervised Similarity-Based Aspect Extraction | Code | 0
Two Stages Approach for Tweet Engagement Prediction |  | 0
Predicting Helpfulness of Online Reviews |  | 0
Page 137 of 401

No leaderboard results yet.