Word Embeddings

Word embedding is the collective name for a set of language modeling and feature learning techniques in natural language processing (NLP) where words or phrases from the vocabulary are mapped to vectors of real numbers.
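
As a toy illustration of this word-to-vector mapping, the sketch below builds a hand-made three-dimensional "embedding" table and compares words by cosine similarity; the vocabulary and the vector values are invented for the example rather than learned from data.

```python
import numpy as np

# Hand-made 3-dimensional embeddings for a tiny vocabulary.
# Real embeddings are learned from large corpora and typically
# have 50-300 dimensions; these values are purely illustrative.
embeddings = {
    "king":  np.array([0.8, 0.3, 0.1]),
    "queen": np.array([0.7, 0.4, 0.1]),
    "apple": np.array([0.1, 0.2, 0.9]),
}

def cosine_similarity(u, v):
    """Cosine of the angle between two vectors; values near 1.0 mean similar."""
    return float(np.dot(u, v) / (np.linalg.norm(u) * np.linalg.norm(v)))

print(cosine_similarity(embeddings["king"], embeddings["queen"]))  # high
print(cosine_similarity(embeddings["king"], embeddings["apple"]))  # low
```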

Techniques for learning word embeddings include Word2Vec, GloVe, and other neural network-based approaches that train on an NLP task such as language modeling or document classification.
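
A minimal sketch of one such technique, training Word2Vec embeddings with the gensim library (the toy corpus and the hyperparameter values are placeholders chosen only so the example runs quickly; a real model would be trained on a large corpus):

```python
from gensim.models import Word2Vec

# Toy corpus: each sentence is a list of tokens.
corpus = [
    ["the", "cat", "sat", "on", "the", "mat"],
    ["the", "dog", "sat", "on", "the", "rug"],
    ["cats", "and", "dogs", "are", "animals"],
]

# Train a skip-gram Word2Vec model (sg=1); with sg=0 gensim
# trains the CBOW variant instead.
model = Word2Vec(
    corpus,
    vector_size=50,  # dimensionality of the learned embeddings
    window=3,        # context window around each target word
    min_count=1,     # keep every token, even ones seen only once
    sg=1,
    epochs=50,
)

vec = model.wv["cat"]                # the 50-dimensional vector for "cat"
print(model.wv.most_similar("cat"))  # nearest neighbours by cosine similarity
```

Pretrained GloVe vectors, by contrast, are usually downloaded and loaded directly rather than retrained, since GloVe fits the global co-occurrence statistics of a large corpus.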

(Image credit: Dynamic Word Embedding for Evolving Semantic Discovery)

Papers

Showing 171–180 of 4,002 papers

Title | Status | Hype
Advancing Fake News Detection: Hybrid Deep Learning with FastText and Explainable AI | - | 0
A comparative analysis of embedding models for patent similarity | - | 0
An efficient domain-independent approach for supervised keyphrase extraction and ranking | - | 0
Empowering Segmentation Ability to Multi-modal Large Language Models | Code | 0
Leveraging Linguistically Enhanced Embeddings for Open Information Extraction | - | 0
Improving Acoustic Word Embeddings through Correspondence Training of Self-supervised Speech Representations | Code | 0
Identifying and interpreting non-aligned human conceptual representations using language modeling | - | 0
VNLP: Turkish NLP Package | Code | 2
Learning Intrinsic Dimension via Information Bottleneck for Explainable Aspect-based Sentiment Analysis | - | 0
The Foundational Capabilities of Large Language Models in Predicting Postoperative Risks Using Clinical Notes | Code | 0
