
Word Embeddings

Word embedding is the collective name for a set of language modeling and feature learning techniques in natural language processing (NLP) where words or phrases from the vocabulary are mapped to vectors of real numbers.

Techniques for learning word embeddings include Word2Vec, GloVe, and other neural network-based approaches that train on an NLP task such as language modeling or document classification.
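As an illustration of the word-to-vector mapping described above, here is a minimal sketch that trains Word2Vec embeddings on a toy corpus with gensim (assuming gensim 4.x; the corpus and hyperparameters are illustrative placeholders, not drawn from any paper listed below).

```python
# Minimal sketch: learning Word2Vec embeddings on a toy corpus.
# Assumes gensim >= 4.0; corpus and hyperparameters are illustrative only.
from gensim.models import Word2Vec

# Each "sentence" is a pre-tokenized list of words.
corpus = [
    ["word", "embeddings", "map", "words", "to", "vectors"],
    ["word2vec", "learns", "embeddings", "from", "context", "windows"],
    ["glove", "learns", "embeddings", "from", "cooccurrence", "counts"],
]

model = Word2Vec(
    corpus,
    vector_size=50,  # dimensionality of the embedding vectors
    window=3,        # context window size
    min_count=1,     # keep every word in this tiny corpus
    epochs=50,       # extra passes since the corpus is so small
)

vec = model.wv["embeddings"]  # a 50-dimensional real-valued vector
print(vec.shape)              # (50,)
print(model.wv.most_similar("word2vec", topn=3))
```

On a realistic corpus, nearby words in the vector space tend to be semantically related; with only three toy sentences the neighbors printed here are not meaningful.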

(Image credit: Dynamic Word Embedding for Evolving Semantic Discovery)

Papers

Showing 21–30 of 4002 papers

Title | Status | Hype
Words as Bridges: Exploring Computational Support for Cross-Disciplinary Translation Work | Code | 0
Poisson-Process Topic Model for Integrating Knowledge from Pre-trained Language Models | - | 0
SVIP: Semantically Contextualized Visual Patches for Zero-Shot Learning | - | 0
Sentiment Analysis in SemEval: A Review of Sentiment Identification Approaches | - | 0
SoftMatcha: A Soft and Fast Pattern Matcher for Billion-Scale Corpus Searches | - | 0
Revisiting Word Embeddings in the LLM Era | - | 0
From Small to Large Language Models: Revisiting the Federalist Papers | Code | 0
Extracting domain-specific terms using contextual word embeddings | - | 0
An Improved Deep Learning Model for Word Embeddings Based Clustering for Large Text Datasets | - | 0
Rumor Detection by Multi-task Suffix Learning based on Time-series Dual Sentiments | - | 0
Page 3 of 401

No leaderboard results yet.