SOTAVerified

Word Embeddings

Word embedding is the collective name for a set of language modeling and feature learning techniques in natural language processing (NLP) where words or phrases from the vocabulary are mapped to vectors of real numbers.

Techniques for learning word embeddings include Word2Vec, GloVe, and other neural-network-based approaches that train on an NLP task such as language modeling or document classification.
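As a minimal illustration of the idea behind skip-gram models such as Word2Vec, the toy numpy sketch below learns small vectors on an invented five-word corpus (the corpus, dimensions, and hyperparameters are made up for the example; this is not the Word2Vec implementation itself, which adds negative sampling, subsampling, and other refinements):

```python
import numpy as np

# Toy skip-gram sketch: learn vectors so that words appearing in the
# same contexts end up with similar embeddings.
corpus = "the king rules the kingdom the queen rules the kingdom".split()
vocab = sorted(set(corpus))
idx = {w: i for i, w in enumerate(vocab)}
V, D = len(vocab), 8  # vocabulary size, embedding dimension

rng = np.random.default_rng(0)
W_in = rng.normal(scale=0.1, size=(V, D))   # input (target) embeddings
W_out = rng.normal(scale=0.1, size=(V, D))  # output (context) embeddings

def softmax(z):
    e = np.exp(z - z.max())
    return e / e.sum()

lr, window = 0.05, 2
for epoch in range(200):
    for t, word in enumerate(corpus):
        for c in range(max(0, t - window), min(len(corpus), t + window + 1)):
            if c == t:
                continue
            i, j = idx[word], idx[corpus[c]]
            # Predict context word j from target word i via full softmax.
            p = softmax(W_out @ W_in[i])
            grad = p.copy()
            grad[j] -= 1.0                   # cross-entropy gradient w.r.t. logits
            g_in = W_out.T @ grad            # gradient for the input vector
            W_out -= lr * np.outer(grad, W_in[i])
            W_in[i] -= lr * g_in

def cos(a, b):
    return a @ b / (np.linalg.norm(a) * np.linalg.norm(b))

# "king" and "queen" share identical contexts here, which should push
# their vectors toward each other.
print(cos(W_in[idx["king"]], W_in[idx["queen"]]))
```

In a real system the vocabulary has hundreds of thousands of entries, so the full softmax over all words becomes the bottleneck; that is why Word2Vec replaces it with hierarchical softmax or negative sampling.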

(Image credit: Dynamic Word Embedding for Evolving Semantic Discovery)

Papers

Showing 526–550 of 4002 papers

Title | Status | Hype
A* CCG Parsing with a Supertag-factored Model | | 0
“Are you calling for the vaporizer you ordered?” Combining Search and Prediction to Identify Orders in Contact Centers | | 0
Are Word Embeddings Really a Bad Fit for the Estimation of Thematic Fit? | | 0
Accurate Dependency Parsing and Tagging of Latin | | 0
BLISS in Non-Isometric Embedding Spaces | | 0
Are Word Embedding-based Features Useful for Sarcasm Detection? | | 0
A LSTM Approach with Sub-Word Embeddings for Mongolian Phrase Break Prediction | | 0
Adapting word2vec to Named Entity Recognition | | 0
A Review on Deep Learning Techniques Applied to Answer Selection | | 0
A Review of Standard Text Classification Practices for Multi-label Toxicity Identification of Online Content | | 0
A Locally Linear Procedure for Word Translation | | 0
Blinov: Distributed Representations of Words for Aspect-Based Sentiment Analysis at SemEval 2014 | | 0
Boosting Named Entity Recognition with Neural Character Embeddings | | 0
All-words Word Sense Disambiguation Using Concept Embeddings | | 0
A Review of Cross-Domain Text-to-SQL Models | | 0
Bit Cipher -- A Simple yet Powerful Word Representation System that Integrates Efficiently with Language Models | | 0
A Retrofitting Model for Incorporating Semantic Relations into Word Embeddings | | 0
Adapting Topic Models using Lexical Associations with Tree Priors | | 0
BLCU_NLP at SemEval-2018 Task 12: An Ensemble Model for Argument Reasoning Based on Hierarchical Attention | | 0
Adapting Pre-trained Word Embeddings For Use In Medical Coding | | 0
Are Girls Neko or Shōjo? Cross-Lingual Alignment of Non-Isomorphic Embeddings with Iterative Normalization | | 0
All That Glitters is Not Gold: A Gold Standard of Adjective-Noun Collocations for German | | 0
BIT at SemEval-2016 Task 1: Sentence Similarity Based on Alignments and Vector with the Weight of Information Content | | 0
Blind signal decomposition of various word embeddings based on join and individual variance explained | | 0
Bootstrap Domain-Specific Sentiment Classifiers from Unlabeled Corpora | | 0
Page 22 of 161

No leaderboard results yet.