Word Embeddings

Word embedding is the collective name for a set of language modeling and feature learning techniques in natural language processing (NLP) where words or phrases from the vocabulary are mapped to vectors of real numbers.
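
To make the definition concrete, here is a minimal sketch of the core idea: a vocabulary mapped to dense real-valued vectors, with word similarity measured geometrically. The vectors below are illustrative random values, not learned embeddings.

```python
import numpy as np

rng = np.random.default_rng(0)
vocab = ["king", "queen", "apple"]
dim = 4

# Each word gets a dense real-valued vector; a trained model would learn these.
embeddings = {word: rng.normal(size=dim) for word in vocab}

def cosine(u, v):
    """Cosine similarity: the usual geometric notion of word similarity."""
    return float(np.dot(u, v) / (np.linalg.norm(u) * np.linalg.norm(v)))

print(embeddings["king"])                               # a 4-dimensional real vector
print(cosine(embeddings["king"], embeddings["queen"]))  # similarity in [-1, 1]
```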

Techniques for learning word embeddings include Word2Vec, GloVe, and other neural-network-based approaches that train on an NLP task such as language modeling or document classification.
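
As a hedged example of what such training looks like in practice, here is a minimal sketch using the gensim library's Word2Vec implementation. Gensim, the toy corpus, and all parameter values are assumptions for illustration; they are not part of this page.

```python
from gensim.models import Word2Vec

# Toy corpus (an assumption for illustration): each document is a list of tokens.
corpus = [
    ["word", "embeddings", "map", "words", "to", "vectors"],
    ["word2vec", "learns", "vectors", "from", "context", "windows"],
    ["similar", "words", "end", "up", "with", "similar", "vectors"],
]

model = Word2Vec(
    sentences=corpus,
    vector_size=50,  # dimensionality of the learned vectors
    window=2,        # context window size
    min_count=1,     # keep every token in this tiny corpus
    sg=1,            # 1 = skip-gram; 0 = CBOW
    epochs=50,
)

vec = model.wv["embeddings"]                   # a 50-dimensional numpy array
print(model.wv.most_similar("words", topn=3))  # nearest neighbours in vector space
```

Skip-gram (sg=1) predicts context words from a target word, which tends to work better on small corpora; CBOW (sg=0) predicts the target from its averaged context and is faster on large ones.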

(Image credit: Dynamic Word Embedding for Evolving Semantic Discovery)

Papers

Showing 2151–2160 of 4002 papers

- Self-Supervised learning with cross-modal transformers for emotion recognition
- Self-training improves Recurrent Neural Networks performance for Temporal Relation Extraction
- Semantic Annotation Aggregation with Conditional Crowdsourcing Models and Word Embeddings
- Semantic-aware Knowledge Distillation for Few-Shot Class-Incremental Learning
- Semantic-aware transformation of short texts using word embeddings: An application in the Food Computing domain
- Semantic Change and Semantic Stability: Variation is Key
- Semantic Change in the Language of UK Parliamentary Debates
- Semantic Clustering and Convolutional Neural Network for Short Text Categorization
- Semantic Features Based on Word Alignments for Estimating Quality of Text Simplification
- Semantic Frame Embeddings for Detecting Relations between Software Requirements
