Word Embeddings

Word embedding is the collective name for a set of language modeling and feature learning techniques in natural language processing (NLP) where words or phrases from the vocabulary are mapped to vectors of real numbers.
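
A minimal sketch of that mapping, assuming a hypothetical four-word vocabulary with randomly initialized vectors (in a real system the vectors are learned from text, not random):

```python
import numpy as np

# Hypothetical toy vocabulary; real systems index tens of thousands of words.
vocab = {"king": 0, "queen": 1, "man": 2, "woman": 3}
dim = 4  # illustrative; common embedding sizes are 50-300

# An embedding is a lookup table: one row of real numbers per word.
# Rows here are random placeholders; training would learn them.
rng = np.random.default_rng(0)
embedding_matrix = rng.normal(size=(len(vocab), dim))

def embed(word: str) -> np.ndarray:
    """Map a word to its vector of real numbers via table lookup."""
    return embedding_matrix[vocab[word]]

print(embed("queen"))  # a 4-dimensional real-valued vector
```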

Techniques for learning word embeddings include Word2Vec, GloVe, and other neural-network-based approaches that train on an NLP task such as language modeling or document classification.
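
For a concrete sense of how such embeddings are trained, here is a hedged sketch using the gensim library's Word2Vec implementation (gensim 4.x API assumed; the toy corpus below is made up, so the learned vectors are not meaningful):

```python
from gensim.models import Word2Vec

# Tiny made-up corpus: a list of tokenized sentences.
corpus = [
    ["the", "king", "rules", "the", "kingdom"],
    ["the", "queen", "rules", "the", "kingdom"],
    ["the", "cat", "sat", "on", "the", "mat"],
]

# Train a skip-gram model (sg=1); min_count=1 keeps every word
# of this toy corpus in the vocabulary.
model = Word2Vec(corpus, vector_size=50, window=2, sg=1,
                 min_count=1, epochs=100)

print(model.wv["queen"][:5])          # first five components of the learned vector
print(model.wv.most_similar("king"))  # nearest neighbors by cosine similarity
```

On real corpora the same call scales up: a larger `vector_size`, a higher `min_count` to drop rare words, and far fewer epochs over much more text.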

(Image credit: Dynamic Word Embedding for Evolving Semantic Discovery)

Papers

Showing 351-360 of 4,002 papers

Title | Status | Hype
Efficient and Flexible Topic Modeling using Pretrained Embeddings and Bag of Sentences | Code | 1
Vision-Language Models Performing Zero-Shot Tasks Exhibit Gender-based Disparities | - | 0
Machine Translation for Accessible Multi-Language Text Analysis | - | 0
Language Embeddings Sometimes Contain Typological Generalizations | Code | 0
News and Load: A Quantitative Exploration of Natural Language Processing Applications for Forecasting Day-ahead Electricity System Demand | - | 0
The 2022 n2c2/UW Shared Task on Extracting Social Determinants of Health | Code | 0
SensePOLAR: Word sense aware interpretability for pre-trained contextual word embeddings | Code | 0
Online Fake Review Detection Using Supervised Machine Learning And BERT Model | - | 0
Analyzing the Representational Geometry of Acoustic Word Embeddings | - | 0
Supervised Acoustic Embeddings And Their Transferability Across Languages | Code | 0
Page 36 of 401

No leaderboard results yet.