
Word Embeddings

Word embedding is the collective name for a set of language modeling and feature learning techniques in natural language processing (NLP) where words or phrases from the vocabulary are mapped to vectors of real numbers.

Techniques for learning word embeddings include Word2Vec, GloVe, and other neural network-based approaches that train on an NLP task such as language modeling or document classification.
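As a rough illustration of this idea, the sketch below trains skip-gram Word2Vec embeddings on a toy corpus with the gensim library (assuming gensim 4.x, where the dimensionality parameter is named vector_size; the corpus and hyperparameters are illustrative only, not drawn from any paper listed here):

```python
# Minimal sketch: learning word embeddings with skip-gram Word2Vec (gensim 4.x).
# The toy corpus and hyperparameters below are illustrative assumptions.
from gensim.models import Word2Vec

# Each document is a list of tokens.
corpus = [
    ["word", "embeddings", "map", "words", "to", "vectors"],
    ["word2vec", "and", "glove", "learn", "embeddings", "from", "text"],
    ["similar", "words", "get", "similar", "vectors"],
]

# sg=1 selects the skip-gram objective; vectors are 50-dimensional.
model = Word2Vec(corpus, vector_size=50, window=2, min_count=1, sg=1, epochs=50)

# Each vocabulary word is now mapped to a dense real-valued vector.
vec = model.wv["embeddings"]          # numpy array of shape (50,)
print(vec[:5])

# Nearest neighbours in the learned vector space.
print(model.wv.most_similar("words", topn=3))
```

On a realistically sized corpus the neighbour lists reflect semantic similarity; on a toy corpus this small they are essentially noise, so the snippet only demonstrates the API shape.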

(Image credit: Dynamic Word Embedding for Evolving Semantic Discovery)

Papers

Showing 621–630 of 4002 papers

Title | Status | Hype
BOUN-ISIK Participation: An Unsupervised Approach for the Named Entity Normalization and Relation Extraction of Bacteria Biotopes | - | 0
bot.zen @ EmpiriST 2015 - A minimally-deep learning PoS-tagger (trained for German CMC and Web data) | - | 0
Abstractive Text Summarization: Enhancing Sequence-to-Sequence Models Using Word Sense Disambiguation and Semantic Content Generalization | - | 0
Borrow a Little from your Rich Cousin: Using Embeddings and Polarities of English Words for Multilingual Sentiment Classification | - | 0
Bootstrapping Polar-Opposite Emotion Dimensions from Online Reviews | - | 0
Any-gram Kernels for Sentence Classification: A Sentiment Analysis Case Study | - | 0
Bootstrapping NLU Models with Multi-task Learning | - | 0
Bootstrapping Multilingual AMR with Contextual Word Alignments | - | 0
N-best Parse Rescoring Based on Dependency-Based Word Embeddings | - | 0
An Unsupervised System for Parallel Corpus Filtering | - | 0
Page 63 of 401

No leaderboard results yet.