
Word Embeddings

Word embedding is the collective name for a set of language modeling and feature learning techniques in natural language processing (NLP) where words or phrases from the vocabulary are mapped to vectors of real numbers.
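As a minimal sketch of that mapping (assuming a toy vocabulary, a small dimensionality, and random rather than learned vectors), the Python snippet below implements the word-to-vector lookup table and the cosine similarity typically used to compare embedding vectors:

```python
import numpy as np

# Hypothetical toy vocabulary; real systems index tens of thousands of words.
vocab = {"king": 0, "queen": 1, "man": 2, "woman": 3}

# Embedding matrix: one d-dimensional real-valued vector per word.
# In practice these vectors are learned; here they are random for illustration.
d = 8
rng = np.random.default_rng(0)
embeddings = rng.normal(size=(len(vocab), d))

def embed(word: str) -> np.ndarray:
    """Map a word to its vector via a simple table lookup."""
    return embeddings[vocab[word]]

def cosine(u: np.ndarray, v: np.ndarray) -> float:
    """Cosine similarity, the usual way embedding vectors are compared."""
    return float(u @ v / (np.linalg.norm(u) * np.linalg.norm(v)))

print(embed("king").shape)                   # (8,)
print(cosine(embed("king"), embed("queen")))  # similarity in [-1, 1]
```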

Techniques for learning word embeddings include Word2Vec, GloVe, and other neural-network-based approaches that train on an NLP task such as language modeling or document classification.
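As an illustration, the sketch below trains a small Word2Vec model with the gensim library (assuming gensim 4.x, where the constructor takes `vector_size` and `epochs`); the toy corpus and all hyperparameter values are illustrative assumptions, not settings taken from any paper listed below:

```python
from gensim.models import Word2Vec

# Tiny illustrative corpus; any iterable of tokenized sentences works.
sentences = [
    ["the", "king", "rules", "the", "kingdom"],
    ["the", "queen", "rules", "the", "kingdom"],
    ["a", "man", "walks"],
    ["a", "woman", "walks"],
]

# Skip-gram Word2Vec (sg=1); min_count=1 keeps every word in this toy corpus.
model = Word2Vec(sentences, vector_size=50, window=2, min_count=1, sg=1, epochs=50)

vec = model.wv["king"]                 # the learned 50-dimensional vector
print(vec.shape)                       # (50,)
print(model.wv.most_similar("king"))   # nearest neighbours by cosine similarity
```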

(Image credit: Dynamic Word Embedding for Evolving Semantic Discovery)

Papers

Showing 2851–2860 of 4002 papers (page 286 of 401)

(All papers below currently have a Hype score of 0.)

- Closed Form Word Embedding Alignment
- Abstractive Document Summarization with Word Embedding Reconstruction
- Abstractive Text Summarization: Enhancing Sequence-to-Sequence Models Using Word Sense Disambiguation and Semantic Content Generalization
- Abusive language in Spanish children and young teenagers' conversations: data preparation and short text classification with contextual word embeddings
- A Call for More Rigor in Unsupervised Cross-lingual Learning
- A Case Study to Reveal if an Area of Interest has a Trend in Ongoing Tweets Using Word and Sentence Embeddings
- A* CCG Parsing with a Supertag-factored Model
- Accurate Dependency Parsing and Tagging of Latin
- A Challenge Set and Methods for Noun-Verb Ambiguity
- A Chinese Writing Correction System for Learning Chinese as a Foreign Language
