Word Embeddings

Word embedding is the collective name for a set of language modeling and feature learning techniques in natural language processing (NLP) where words or phrases from the vocabulary are mapped to vectors of real numbers.

Techniques for learning word embeddings include Word2Vec, GloVe, and other neural network-based approaches that are trained on an NLP task such as language modeling or document classification.

(Image credit: Dynamic Word Embedding for Evolving Semantic Discovery)
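
Since all of these methods ultimately produce a lookup from words to real-valued vectors, the idea can be shown with a minimal sketch using gensim's Word2Vec implementation. The toy corpus and all parameter values below are illustrative assumptions, not settings from any paper listed on this page.

```python
# Minimal sketch: learning word embeddings with gensim's Word2Vec.
# Assumes gensim is installed; the corpus is a toy example.
from gensim.models import Word2Vec

# Each document is a list of tokens.
corpus = [
    ["word", "embeddings", "map", "words", "to", "vectors"],
    ["word2vec", "learns", "embeddings", "from", "context"],
    ["glove", "learns", "embeddings", "from", "cooccurrence", "statistics"],
]

# Train a small skip-gram model (sg=1); vector_size is the embedding dimension.
model = Word2Vec(corpus, vector_size=50, window=2, min_count=1, sg=1, epochs=50)

# Each word in the vocabulary is now a vector of real numbers.
vec = model.wv["embeddings"]  # numpy array of shape (50,)
print(vec[:5])

# Relatedness between words is measured as cosine similarity between vectors.
print(model.wv.similarity("word2vec", "glove"))
print(model.wv.most_similar("embeddings", topn=3))
```

On a corpus this small the similarities are meaningless; the point is only the API shape: train on tokenized text, then query the resulting word-to-vector mapping.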

Papers

Showing 3381–3390 of 4002 papers

Title | Status | Hype
Representation Bias of Adolescents in AI: A Bilingual, Bicultural Study | Code | 0
NL-FIIT at IEST-2018: Emotion Recognition utilizing Neural Networks and Multi-level Preprocessing | Code | 0
Bidirectional LSTM-CRF for Clinical Concept Extraction | Code | 0
Towards Understanding Gender Bias in Relation Extraction | Code | 0
Does mBERT understand Romansh? Evaluating word embeddings using word alignment | Code | 0
Second-Order Word Embeddings from Nearest Neighbor Topological Features | Code | 0
Towards Understanding the Geometry of Knowledge Graph Embeddings | Code | 0
Towards Understanding the Word Sensitivity of Attention Layers: A Study via Random Features | Code | 0
Bidirectional Attention as a Mixture of Continuous Word Experts | Code | 0
NNVLP: A Neural Network-Based Vietnamese Language Processing Toolkit | Code | 0

No leaderboard results yet.