
Word Embeddings

Word embedding is the collective name for a set of language modeling and feature learning techniques in natural language processing (NLP) where words or phrases from the vocabulary are mapped to vectors of real numbers.
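To make the mapping concrete, here is a minimal sketch using hand-picked 3-dimensional vectors; the words and vector values are illustrative assumptions only (trained embeddings typically have 50 to 300 dimensions), not the output of any real model.

```python
import numpy as np

# Toy embedding table: each word maps to a vector of real numbers.
# These 3-dimensional vectors are made up for illustration.
embeddings = {
    "king":  np.array([0.8, 0.3, 0.1]),
    "queen": np.array([0.7, 0.4, 0.1]),
    "apple": np.array([0.1, 0.2, 0.9]),
}

def cosine_similarity(u, v):
    """Cosine of the angle between two vectors; values near 1.0 mean similar."""
    return float(np.dot(u, v) / (np.linalg.norm(u) * np.linalg.norm(v)))

# Semantically related words sit closer together in the vector space.
print(cosine_similarity(embeddings["king"], embeddings["queen"]))  # high
print(cosine_similarity(embeddings["king"], embeddings["apple"]))  # low
```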

Techniques for learning word embeddings include Word2Vec, GloVe, and other neural-network-based approaches that train on an NLP task such as language modeling or document classification.
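As one concrete example of the Word2Vec approach mentioned above, the sketch below trains a skip-gram model with the gensim library (gensim >= 4.0 API); the tiny corpus and all hyperparameter values are placeholder assumptions chosen only to make the example run.

```python
from gensim.models import Word2Vec

# Placeholder corpus: in practice you would train on millions of sentences.
sentences = [
    ["word", "embeddings", "map", "words", "to", "vectors"],
    ["word2vec", "learns", "embeddings", "from", "context", "windows"],
    ["glove", "factorizes", "global", "cooccurrence", "statistics"],
]

# All hyperparameters here are illustrative, not recommended settings.
model = Word2Vec(
    sentences,
    vector_size=100,  # dimensionality of the learned vectors (gensim >= 4.0 name)
    window=5,         # context window size
    min_count=1,      # keep every word in this tiny corpus
    sg=1,             # 1 = skip-gram objective, 0 = CBOW
)

vec = model.wv["embeddings"]                           # learned vector for a word
similar = model.wv.most_similar("embeddings", topn=3)  # nearest neighbours
print(vec.shape, similar)
```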

(Image credit: Dynamic Word Embedding for Evolving Semantic Discovery)

Papers

Showing 901–910 of 4002 papers

Title | Hype
Abstractive Document Summarization with Word Embedding Reconstruction | 0
Cooperative Self-training of Machine Reading Comprehension | 0
CSReader at SemEval-2018 Task 11: Multiple Choice Question Answering as Textual Entailment | 0
Coordination Boundary Identification without Labeled Data for Compound Terms Disambiguation | 0
CopyBERT: A Unified Approach to Question Generation with Self-Attention | 0
Attention Modeling for Targeted Sentiment | 0
Corporate IT-support Help-Desk Process Hybrid-Automation Solution with Machine Learning Approach | 0
Corpus specificity in LSA and Word2vec: the role of out-of-domain documents | 0
Correcting the Common Discourse Bias in Linear Representation of Sentences using Conceptors | 0
Cultural Cartography with Word Embeddings | 0
