
Word Embeddings

Word embedding is the collective name for a set of language modeling and feature learning techniques in natural language processing (NLP) where words or phrases from the vocabulary are mapped to vectors of real numbers.

Techniques for learning word embeddings include Word2Vec, GloVe, and neural network models trained on an auxiliary NLP task such as language modeling or document classification. A minimal example is sketched below.
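As a concrete illustration, the sketch below trains skip-gram Word2Vec embeddings with the gensim library (assuming gensim >= 4.0). The toy corpus, vector dimensionality, and training parameters are illustrative assumptions, not taken from any particular paper.

```python
# A minimal sketch of learning word embeddings with Word2Vec via gensim.
# The corpus and hyperparameters below are illustrative, not canonical.
from gensim.models import Word2Vec

# A toy corpus: each document is a pre-tokenized list of words.
corpus = [
    ["the", "king", "rules", "the", "kingdom"],
    ["the", "queen", "rules", "the", "kingdom"],
    ["embeddings", "map", "words", "to", "vectors"],
]

# Train a skip-gram model that maps each vocabulary word to a
# 50-dimensional vector of real numbers.
model = Word2Vec(
    sentences=corpus,
    vector_size=50,   # dimensionality of the embedding vectors
    window=2,         # context window size
    min_count=1,      # keep every word in this tiny corpus
    sg=1,             # 1 = skip-gram, 0 = CBOW
    epochs=100,       # many passes, since the corpus is tiny
)

# Each word is now a real-valued vector; similar words end up nearby.
print(model.wv["king"].shape)                  # (50,)
print(model.wv.most_similar("king", topn=3))   # nearest neighbors by cosine
```

With a realistic corpus, the learned vectors place semantically related words (e.g., "king" and "queen") close together in the embedding space, which is the property the definition above describes.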

(Image credit: Dynamic Word Embedding for Evolving Semantic Discovery)

Papers

Showing 2601–2610 of 4002 papers

Title | Status | Hype
USAAR-WLV: Hypernym Generation with Deep Neural Nets | - | 0
Usability and Accessibility of Bantu Language Dictionaries in the Digital Age: Mobile Access in an Open Environment | - | 0
Use Case: Romanian Language Resources in the LOD Paradigm | - | 0
Use Generalized Representations, But Do Not Forget Surface Features | - | 0
Use of unsupervised word classes for entity recognition: Application to the detection of disorders in clinical reports | - | 0
User recommendation system based on MIND dataset | - | 0
USF at SemEval-2019 Task 6: Offensive Language Detection Using LSTM With Word Embeddings | - | 0
USI-IR at IEST 2018: Sequence Modeling and Pseudo-Relevance Feedback for Implicit Emotion Detection | - | 0
Using Adversarial Debiasing to Remove Bias from Word Embeddings | - | 0
Controllable Speaking Styles Using a Large Language Model | - | 0

No leaderboard results yet.