
Word Embeddings

Word embedding is the collective name for a set of language modeling and feature learning techniques in natural language processing (NLP) where words or phrases from the vocabulary are mapped to vectors of real numbers.

Techniques for learning word embeddings include Word2Vec, GloVe, and other neural network-based approaches that train on an NLP task such as language modeling or document classification.
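To make the idea concrete, here is a minimal sketch of training Word2Vec embeddings with the gensim library (assuming gensim >= 4.0 is installed); the toy corpus and all parameter values are illustrative choices, not part of the original page:

```python
# Minimal Word2Vec sketch, assuming gensim >= 4.0 (pip install gensim).
from gensim.models import Word2Vec

# Toy corpus: each document is a pre-tokenized list of words (illustrative only).
sentences = [
    ["word", "embeddings", "map", "words", "to", "vectors"],
    ["word2vec", "learns", "embeddings", "from", "local", "context"],
    ["glove", "learns", "embeddings", "from", "cooccurrence", "counts"],
]

# Train a skip-gram model (sg=1); vector_size is the embedding dimension.
model = Word2Vec(sentences, vector_size=50, window=2, min_count=1, sg=1, epochs=50)

# Each vocabulary word is now mapped to a dense real-valued vector.
vec = model.wv["embeddings"]
print(vec.shape)  # (50,)

# Words that appear in similar contexts end up near each other in the space.
print(model.wv.most_similar("embeddings", topn=3))
```

On real corpora the dimension, window, and epoch count would be tuned to the task; the values above simply keep the toy example fast.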

(Image credit: Dynamic Word Embedding for Evolving Semantic Discovery)

Papers

Showing 2341–2350 of 4002 papers

| Title | Status | Hype |
| --- | --- | --- |
| The Cinderella Complex: Word Embeddings Reveal Gender Stereotypes in Movies and Books | Code | 0 |
| Syntax Helps ELMo Understand Semantics: Is Syntax Still Relevant in a Deep Neural Architecture for SRL? | | 0 |
| Learning Semantic Representations for Novel Words: Leveraging Both Form and Context | Code | 0 |
| Neural sequence labeling for Vietnamese POS Tagging and NER | | 0 |
| Deep Neural Networks for Query Expansion using Word Embeddings | | 0 |
| microNER: A Micro-Service for German Named Entity Recognition based on BiLSTM-CRF | | 0 |
| Learning acoustic word embeddings with phonetically associated triplet network | | 0 |
| Semantic Term "Blurring" and Stochastic "Barcoding" for Improved Unsupervised Text Classification | | 0 |
| Unsupervised Hyperalignment for Multilingual Word Embeddings | | 0 |
| Combining Long Short Term Memory and Convolutional Neural Network for Cross-Sentence n-ary Relation Extraction | | 0 |

No leaderboard results yet.