
Word Embeddings

Word embedding is the collective name for a set of language modeling and feature learning techniques in natural language processing (NLP) where words or phrases from the vocabulary are mapped to vectors of real numbers.
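Concretely, each word is represented as a dense vector, and geometric closeness between vectors stands in for semantic relatedness. Below is a minimal sketch with made-up 4-dimensional vectors; the words and values are purely illustrative and not taken from any trained model:

```python
import numpy as np

# Hypothetical 4-dimensional embeddings; real models use 50-300+ dimensions
# and learn the values from large corpora.
embeddings = {
    "king":  np.array([0.80, 0.65, 0.10, 0.20]),
    "queen": np.array([0.78, 0.70, 0.15, 0.22]),
    "apple": np.array([0.10, 0.05, 0.90, 0.70]),
}

def cosine_similarity(u, v):
    """Cosine of the angle between two vectors: 1.0 means identical direction."""
    return float(np.dot(u, v) / (np.linalg.norm(u) * np.linalg.norm(v)))

print(cosine_similarity(embeddings["king"], embeddings["queen"]))  # high (~1.0)
print(cosine_similarity(embeddings["king"], embeddings["apple"]))  # much lower
```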

Techniques for learning word embeddings include Word2Vec, GloVe, and other neural-network-based approaches that train on an NLP task such as language modeling or document classification.
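As a concrete illustration of one such technique, the sketch below trains a small skip-gram Word2Vec model with the gensim library. The toy corpus and hyperparameters are placeholder choices for demonstration; real models are trained on corpora of millions of sentences:

```python
from gensim.models import Word2Vec

# Toy corpus: a list of tokenized sentences (placeholder data).
sentences = [
    ["the", "king", "rules", "the", "kingdom"],
    ["the", "queen", "rules", "the", "kingdom"],
    ["the", "farmer", "grows", "an", "apple"],
]

# sg=1 selects the skip-gram architecture; vector_size is the embedding
# dimensionality; window is the context size; min_count=1 keeps rare words.
model = Word2Vec(sentences, vector_size=50, window=2, min_count=1, sg=1, epochs=50)

vector = model.wv["king"]             # 50-dimensional embedding for "king"
print(model.wv.most_similar("king"))  # nearest neighbors by cosine similarity
```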

(Image credit: Dynamic Word Embedding for Evolving Semantic Discovery)

Papers

Showing 701–710 of 4,002 papers

| Title | Status | Hype |
| --- | --- | --- |
| Fusing Document, Collection and Label Graph-based Representations with Word Embeddings for Text Classification | Code | 0 |
| GAProtoNet: A Multi-head Graph Attention-based Prototypical Network for Interpretable Text Classification | Code | 0 |
| Bigrams and BiLSTMs: Two Neural Networks for Sequential Metaphor Detection | Code | 0 |
| Generalizing Word Embeddings using Bag of Subwords | Code | 0 |
| Are Girls Neko or Shōjo? Cross-Lingual Alignment of Non-Isomorphic Embeddings with Iterative Normalization | Code | 0 |
| Churn Intent Detection in Multilingual Chatbot Conversations and Social Media | Code | 0 |
| Abolitionist Networks: Modeling Language Change in Nineteenth-Century Activist Newspapers | Code | 0 |
| Deep convolutional acoustic word embeddings using word-pair side information | Code | 0 |
| A Resource-Light Method for Cross-Lingual Semantic Textual Similarity | Code | 0 |
| Deep Image-to-Recipe Translation | Code | 0 |

No leaderboard results yet.