Word Embeddings

Word embedding is the collective name for a set of language modeling and feature learning techniques in natural language processing (NLP) where words or phrases from the vocabulary are mapped to vectors of real numbers.
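To make the mapping concrete, here is a toy sketch: a lookup table from words to small real-valued vectors, compared with cosine similarity. The words and vectors are invented for illustration, not learned from data; real embeddings are trained on corpora and typically have 50 to 300 dimensions.

import numpy as np

# Toy lookup table: each word maps to a dense real-valued vector.
# These 4-dimensional vectors are made up for illustration only.
embeddings = {
    "king":  np.array([0.8, 0.3, 0.1, 0.9]),
    "queen": np.array([0.7, 0.4, 0.1, 0.8]),
    "apple": np.array([0.1, 0.9, 0.8, 0.2]),
}

def cosine_similarity(u, v):
    # Cosine of the angle between two vectors; values near 1.0
    # indicate that the words occupy nearby regions of the space.
    return float(u @ v / (np.linalg.norm(u) * np.linalg.norm(v)))

print(cosine_similarity(embeddings["king"], embeddings["queen"]))  # high
print(cosine_similarity(embeddings["king"], embeddings["apple"]))  # lower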

Techniques for learning word embeddings include Word2Vec, GloVe, and other neural network-based approaches that train on an NLP task such as language modeling or document classification.
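As one such technique, the sketch below trains a skip-gram Word2Vec model with the gensim library (assuming gensim 4.x is installed); the toy corpus and hyperparameter values are illustrative, not recommendations.

from gensim.models import Word2Vec

# A tiny tokenized corpus; real training uses millions of sentences.
sentences = [
    ["the", "king", "rules", "the", "kingdom"],
    ["the", "queen", "rules", "the", "kingdom"],
    ["i", "ate", "an", "apple", "for", "lunch"],
]

# Train a skip-gram model: each center word learns to predict the
# words in its surrounding context window.
model = Word2Vec(
    sentences,
    vector_size=50,   # dimensionality of the learned vectors
    window=3,         # context window size
    min_count=1,      # keep every word in this toy corpus
    sg=1,             # 1 = skip-gram, 0 = CBOW
    epochs=50,
)

vector = model.wv["king"]             # the learned 50-d vector for "king"
print(model.wv.most_similar("king"))  # nearest neighbors in embedding space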

(Image credit: Dynamic Word Embedding for Evolving Semantic Discovery)

Papers

Showing 91–100 of 4,002 papers (page 10 of 401)

Title | Status | Hype
TS-HTFA: Advancing Time Series Forecasting via Hierarchical Text-Free Alignment with Large Language Models | - | 0
GAProtoNet: A Multi-head Graph Attention-based Prototypical Network for Interpretable Text Classification | Code | 0
Target word activity detector: An approach to obtain ASR word boundaries without lexicon | - | 0
DiffEditor: Enhancing Speech Editing with Semantic Enrichment and Acoustic Consistency | Code | 1
Visualizing Temporal Topic Embeddings with a Compass | - | 0
Analyzing Correlations Between Intrinsic and Extrinsic Bias Metrics of Static Word Embeddings With Their Measuring Biases Aligned | - | 0
Protecting Copyright of Medical Pre-trained Language Models: Training-Free Backdoor Model Watermarking | - | 0
A Simplified Retriever to Improve Accuracy of Phenotype Normalizations by Large Language Models | - | 0
Word and Phrase Features in Graph Convolutional Network for Automatic Question Classification | - | 0
From cart to truck: meaning shift through words in English in the last two centuries | - | 0

No leaderboard results yet.