
Word Embeddings

Word embedding is the collective name for a set of language modeling and feature learning techniques in natural language processing (NLP) where words or phrases from the vocabulary are mapped to vectors of real numbers.
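
To make the word-to-vector mapping concrete, here is a minimal Python sketch of an embedding lookup table. The vocabulary, the `embed` helper, and the random vectors are illustrative assumptions for this example only; in practice the vectors are learned from data rather than randomly initialized.

```python
import numpy as np

# Toy vocabulary and dimensionality; both are illustrative assumptions.
vocab = ["king", "queen", "man", "woman"]
word_to_index = {word: i for i, word in enumerate(vocab)}
embedding_dim = 4

# The embedding table holds one real-valued vector per vocabulary word.
# Random here for brevity; real embeddings are learned from text corpora.
rng = np.random.default_rng(0)
embeddings = rng.normal(size=(len(vocab), embedding_dim))

def embed(word: str) -> np.ndarray:
    """Map a word in the vocabulary to its vector via a table lookup."""
    return embeddings[word_to_index[word]]

print(embed("queen"))  # a 4-dimensional vector of real numbers
```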

Techniques for learning word embeddings include Word2Vec, GloVe, and other neural approaches that train on an NLP task such as language modeling or document classification.
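
As a sketch of how one such technique is trained, the snippet below fits a skip-gram Word2Vec model with the gensim library (assuming gensim 4.x). The toy corpus and hyperparameter values are illustrative assumptions, not anything prescribed by this page.

```python
from gensim.models import Word2Vec

# Hypothetical toy corpus; real training uses millions of sentences.
sentences = [
    ["the", "king", "rules", "the", "kingdom"],
    ["the", "queen", "rules", "the", "kingdom"],
    ["a", "man", "walks", "in", "the", "city"],
    ["a", "woman", "walks", "in", "the", "city"],
]

# Train a skip-gram model (sg=1); all parameter values are illustrative.
model = Word2Vec(
    sentences,
    vector_size=50,  # dimensionality of the learned vectors
    window=2,        # context window size
    min_count=1,     # keep every word in this tiny corpus
    sg=1,            # 1 = skip-gram, 0 = CBOW
    epochs=100,      # a tiny corpus needs many passes
)

# Look up a learned vector and query nearest neighbors in embedding space.
vector = model.wv["queen"]
print(model.wv.most_similar("queen", topn=2))
```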

(Image credit: Dynamic Word Embedding for Evolving Semantic Discovery)

Papers

Showing papers 1701–1710 of 4,002:

A Minimalist Approach to Shallow Discourse Parsing and Implicit Relation Recognition
Additional Shared Decoder on Siamese Multi-view Encoders for Learning Acoustic Word Embeddings
Exploring the Combination of Contextual Word Embeddings and Knowledge Graph Embeddings
Hallym: Named Entity Recognition on Twitter with Word Representation
Handling Homographs in Neural Machine Translation
Handling Normalization Issues for Part-of-Speech Tagging of Online Conversational Text
Handling Out-Of-Vocabulary Problem in Hangeul Word Embeddings
Code-switching Language Modeling With Bilingual Word Embeddings: A Case Study for Egyptian Arabic-English
Hash2Vec, Feature Hashing for Word Embeddings
Exploring sentence informativeness

Leaderboards

No leaderboard results yet.