
Word Embeddings

Word embedding is the collective name for a set of language modeling and feature learning techniques in natural language processing (NLP) where words or phrases from the vocabulary are mapped to vectors of real numbers.

Techniques for learning word embeddings include Word2Vec, GloVe, and other neural network-based approaches that are trained on an NLP task such as language modeling or document classification.
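
As a concrete illustration of the idea, the sketch below trains skip-gram Word2Vec embeddings on a toy corpus with the gensim library. This is a minimal sketch, assuming gensim 4.x; the corpus, hyperparameters, and query words are illustrative only and not taken from this page.

    # Minimal Word2Vec sketch using gensim (assumes gensim 4.x).
    # The toy corpus and hyperparameters are illustrative only.
    from gensim.models import Word2Vec

    # Each "sentence" is a list of tokens; a real setup would
    # tokenize a large corpus instead.
    corpus = [
        ["the", "king", "rules", "the", "kingdom"],
        ["the", "queen", "rules", "the", "kingdom"],
        ["the", "cat", "sat", "on", "the", "mat"],
        ["the", "dog", "sat", "on", "the", "rug"],
    ]

    model = Word2Vec(
        sentences=corpus,
        vector_size=50,   # dimensionality of the real-valued vectors
        window=2,         # context window size
        min_count=1,      # keep every token in this tiny corpus
        sg=1,             # 1 = skip-gram, 0 = CBOW
        epochs=50,
    )

    # Each vocabulary word is now mapped to a 50-dimensional real vector.
    print(model.wv["king"].shape)            # (50,)
    # Similarity queries use cosine similarity between those vectors.
    print(model.wv.most_similar("cat", topn=3))

GloVe differs in that it is fit to global word co-occurrence counts rather than trained with a sliding-window predictive objective, but it likewise maps each word to a dense real-valued vector.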

(Image credit: Dynamic Word Embedding for Evolving Semantic Discovery)

Papers

Showing 2671–2680 of 4002 papers

Title | Status | Hype
Bigrams and BiLSTMs Two Neural Networks for Sequential Metaphor Detection | Code | 0
Multi-Module Recurrent Neural Networks with Transfer Learning | - | 0
SB@GU at the Complex Word Identification 2018 Shared Task | - | 0
Computationally Constructed Concepts: A Machine Learning Approach to Metaphor Interpretation Using Usage-Based Construction Grammatical Cues | - | 0
Conditional Random Fields for Metaphor Detection | - | 0
Entropy-Based Subword Mining with an Application to Word Embeddings | - | 0
Fusing Document, Collection and Label Graph-based Representations with Word Embeddings for Text Classification | Code | 0
NILC at CWI 2018: Exploring Feature Engineering and Feature Learning | - | 0
Addressing Low-Resource Scenarios with Character-aware Embeddings | - | 0
Using Language Learner Data for Metaphor Detection | Code | 0
Page 268 of 401

No leaderboard results yet.