
Word Embeddings

Word embedding is the collective name for a set of language modeling and feature learning techniques in natural language processing (NLP) where words or phrases from the vocabulary are mapped to vectors of real numbers.
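To illustrate the idea, here is a minimal sketch of such a mapping using hypothetical, hand-picked 3-dimensional vectors (real embeddings are learned from corpora and typically have hundreds of dimensions); semantically related words receive vectors with high cosine similarity:

```python
import numpy as np

# Hypothetical toy embedding table: each word maps to a dense real-valued vector.
# Learned embeddings would come from a model, not be written by hand.
embeddings = {
    "king":  np.array([0.8, 0.3, 0.1]),
    "queen": np.array([0.7, 0.4, 0.1]),
    "apple": np.array([0.1, 0.2, 0.9]),
}

def cosine_similarity(u, v):
    """Cosine of the angle between two vectors; closer to 1 means more similar."""
    return float(np.dot(u, v) / (np.linalg.norm(u) * np.linalg.norm(v)))

print(cosine_similarity(embeddings["king"], embeddings["queen"]))  # high (~0.99)
print(cosine_similarity(embeddings["king"], embeddings["apple"]))  # low (~0.29)
```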

Techniques for learning word embeddings include Word2Vec, GloVe, and other approaches that train on an NLP task such as language modeling or document classification.
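As a concrete example, the following is a minimal sketch of training Word2Vec embeddings with the gensim library (assuming gensim 4.x; the tiny corpus is purely illustrative):

```python
from gensim.models import Word2Vec

# Toy corpus: a list of tokenized sentences (illustrative only).
sentences = [
    ["the", "king", "rules", "the", "kingdom"],
    ["the", "queen", "rules", "the", "kingdom"],
    ["the", "dog", "chased", "the", "cat"],
]

# Train skip-gram embeddings (sg=1); vector_size is the embedding dimension.
model = Word2Vec(sentences, vector_size=50, window=2, min_count=1, sg=1, epochs=50)

# Look up the learned vector for a word and query its nearest neighbors.
vector = model.wv["king"]                   # a 50-dimensional numpy array
print(model.wv.most_similar("king", topn=3))
```

On a real corpus, a larger vector_size (e.g. 100 to 300) and the default min_count are more typical choices.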

(Image credit: Dynamic Word Embedding for Evolving Semantic Discovery)

Papers

Showing 1171–1180 of 4,002 papers

Title | Status | Hype
Exploring Diachronic Lexical Semantics with JeSemE | Code | 0
Learning to Represent Bilingual Dictionaries | Code | 0
Plumeria at SemEval-2022 Task 6: Robust Approaches for Sarcasm Detection for English and Arabic Using Transformers and Data Augmentation | Code | 0
Deeper Attention to Abusive User Content Moderation | | 0
Deep Dialog Act Recognition using Multiple Token, Segment, and Context Information Representations | | 0
AWE: Asymmetric Word Embedding for Textual Entailment | | 0
An Empirical Study of Discriminative Sequence Labeling Models for Vietnamese Text Processing | | 0
Deep Convolutional Neural Networks for Sentiment Analysis of Short Texts | | 0
Deep Contextualized Word Embeddings in Transition-Based and Graph-Based Dependency Parsing - A Tale of Two Parsers Revisited | | 0
