
Word Embeddings

Word embedding is the collective name for a set of language modeling and feature learning techniques in natural language processing (NLP) where words or phrases from the vocabulary are mapped to vectors of real numbers.

Techniques for learning word embeddings include Word2Vec, GloVe, and other neural-network-based approaches that train on an NLP task such as language modeling or document classification.
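As the definition above describes, an embedding maps each word to a vector of real numbers, and relatedness between words is then typically measured with cosine similarity between their vectors. A minimal sketch of this idea, using hand-picked toy vectors for illustration (not trained embeddings from Word2Vec or GloVe):

```python
import math

# Toy embedding table: each word maps to a vector of real numbers.
# These vectors are hand-picked for illustration, not learned.
embeddings = {
    "king":  [0.80, 0.65, 0.10],
    "queen": [0.78, 0.70, 0.12],
    "apple": [0.10, 0.05, 0.90],
}

def cosine_similarity(u, v):
    """Cosine of the angle between two vectors; 1.0 means same direction."""
    dot = sum(a * b for a, b in zip(u, v))
    norm_u = math.sqrt(sum(a * a for a in u))
    norm_v = math.sqrt(sum(b * b for b in v))
    return dot / (norm_u * norm_v)

# Semantically related words should score higher than unrelated ones.
sim_royal = cosine_similarity(embeddings["king"], embeddings["queen"])
sim_fruit = cosine_similarity(embeddings["king"], embeddings["apple"])
assert sim_royal > sim_fruit
```

Trained embeddings work the same way, except the vectors (typically 100–300 dimensions) are learned from co-occurrence statistics in large corpora rather than chosen by hand.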

(Image credit: Dynamic Word Embedding for Evolving Semantic Discovery)

Papers

Showing 551–575 of 4,002 papers

Title | Status | Hype
Learning to Name Classes for Vision and Language Models | — | 0
Automatic Generation of Multiple-Choice Questions | — | 0
Addressing Biases in the Texts using an End-to-End Pipeline Approach | — | 0
Uncovering Challenges of Solving the Continuous Gromov-Wasserstein Problem | Code | 0
Classifying Text-Based Conspiracy Tweets related to COVID-19 using Contextualized Word Embeddings | — | 0
Changes in Commuter Behavior from COVID-19 Lockdowns in the Atlanta Metropolitan Area | — | 0
Deep learning model for Mongolian Citizens Feedback Analysis using Word Vector Embeddings | — | 0
Exploring Category Structure with Contextual Language Models and Lexical Semantic Networks | — | 0
Evaluation of Word Embeddings for the Social Sciences | — | 0
Dialectograms: Machine Learning Differences between Discursive Communities | — | 0
Zero-Shot Learning for Requirements Classification: An Exploratory Study | Code | 0
Vision-Language Models Performing Zero-Shot Tasks Exhibit Gender-based Disparities | — | 0
Machine Translation for Accessible Multi-Language Text Analysis | — | 0
Language Embeddings Sometimes Contain Typological Generalizations | Code | 0
News and Load: A Quantitative Exploration of Natural Language Processing Applications for Forecasting Day-ahead Electricity System Demand | — | 0
The 2022 n2c2/UW Shared Task on Extracting Social Determinants of Health | Code | 0
SensePOLAR: Word sense aware interpretability for pre-trained contextual word embeddings | Code | 0
Online Fake Review Detection Using Supervised Machine Learning And BERT Model | — | 0
Analyzing the Representational Geometry of Acoustic Word Embeddings | — | 0
Supervised Acoustic Embeddings And Their Transferability Across Languages | Code | 0
The Undesirable Dependence on Frequency of Gender Bias Metrics Based on Word Embeddings | Code | 0
Using meaning instead of words to track topics | — | 0
TegFormer: Topic-to-Essay Generation with Good Topic Coverage and High Text Coherence | — | 0
Text classification in shipping industry using unsupervised models and Transformer based supervised models | — | 0
Exploring Interpretability of Independent Components of Word Embeddings with Automated Word Intruder Test | — | 0
Page 23 of 161

No leaderboard results yet.