
Word Embeddings

Word embedding is the collective name for a set of language modeling and feature learning techniques in natural language processing (NLP) where words or phrases from the vocabulary are mapped to vectors of real numbers.

Techniques for learning word embeddings include Word2Vec, GloVe, and neural network-based approaches that train on an NLP task such as language modeling or document classification.
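As a concrete illustration, the sketch below trains a toy skip-gram Word2Vec model with the gensim library and reads back the learned real-valued vectors. It assumes gensim >= 4.0; the three-sentence corpus and all hyperparameters are illustrative only, not a recommended configuration.

```python
# Minimal sketch: learning word embeddings with gensim's Word2Vec.
# Assumes gensim >= 4.0; the corpus below is a toy example.
from gensim.models import Word2Vec

# Tiny tokenized corpus; real training uses millions of sentences.
corpus = [
    ["word", "embeddings", "map", "words", "to", "vectors"],
    ["king", "queen", "man", "woman"],
    ["paris", "france", "berlin", "germany"],
]

# sg=1 selects the skip-gram objective; vector_size is the
# dimensionality of the learned embedding space.
model = Word2Vec(corpus, vector_size=50, window=2, min_count=1, sg=1, epochs=50)

vec = model.wv["king"]      # a 50-dimensional vector of real numbers
print(vec.shape)            # (50,)

# Nearest neighbors by cosine similarity in the embedding space.
print(model.wv.most_similar("king", topn=3))
```

On a toy corpus like this the neighbors are not meaningful; trained on a large corpus, nearby vectors tend to be semantically related words.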

(Image credit: Dynamic Word Embedding for Evolving Semantic Discovery)

Papers

Showing 2841–2850 of 4002 papers (page 285 of 401)

Title | Status | Hype
Linguistically-Informed Self-Attention for Semantic Role Labeling | Code | 0
Bilingual Embeddings with Random Walks over Multilingual Wordnets | | 0
Can Eye Movement Data Be Used As Ground Truth For Word Embeddings Evaluation? | | 0
NE-Table: A Neural key-value table for Named Entities | Code | 0
Automated essay scoring with string kernels and word embeddings | | 0
Dynamic Meta-Embeddings for Improved Sentence Representations | Code | 0
A Deep Representation Empowered Distant Supervision Paradigm for Clinical Information Extraction | | 0
Probabilistic Word Association for Dialogue Act Classification with Recurrent Neural Networks | Code | 0
Utilizing Neural Networks and Linguistic Metadata for Early Detection of Depression Indications in Text Sequences | Code | 1
LightRel SemEval-2018 Task 7: Lightweight and Fast Relation Classification | | 0

Leaderboards

No leaderboard results yet.