
Word Embeddings

Word embedding is the collective name for a set of language modeling and feature learning techniques in natural language processing (NLP) where words or phrases from the vocabulary are mapped to vectors of real numbers.
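
As a minimal illustration of this idea, the sketch below stores a toy embedding table as NumPy vectors and compares words with cosine similarity. The vocabulary and the 4-dimensional vectors are invented for the example; real embeddings are learned from data and typically have 50–300 dimensions.

```python
import numpy as np

# Toy embedding table: each word maps to a vector of real numbers.
# These 4-dimensional vectors are made up for illustration only.
embeddings = {
    "king":  np.array([0.50, 0.68, -0.59, 0.20]),
    "queen": np.array([0.54, 0.86, -0.72, 0.12]),
    "apple": np.array([-0.41, 0.12, 0.35, 0.88]),
}

def cosine_similarity(u, v):
    """Cosine of the angle between two vectors; 1.0 means identical direction."""
    return float(np.dot(u, v) / (np.linalg.norm(u) * np.linalg.norm(v)))

# Semantically related words should score higher than unrelated ones.
print(cosine_similarity(embeddings["king"], embeddings["queen"]))
print(cosine_similarity(embeddings["king"], embeddings["apple"]))
```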

Techniques for learning word embeddings include Word2Vec, GloVe, and other neural approaches that train on an NLP task such as language modeling or document classification.
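
As a concrete example, the sketch below trains skip-gram Word2Vec embeddings with the gensim library. The three-sentence corpus and the hyperparameter values are placeholders for illustration; a useful model would need a corpus of millions of tokens.

```python
from gensim.models import Word2Vec

# Tiny placeholder corpus; each sentence is a list of tokens.
sentences = [
    ["word", "embeddings", "map", "words", "to", "vectors"],
    ["word2vec", "learns", "embeddings", "from", "raw", "text"],
    ["glove", "learns", "embeddings", "from", "cooccurrence", "counts"],
]

model = Word2Vec(
    sentences,
    vector_size=100,  # dimensionality of the learned vectors
    window=5,         # context window size
    min_count=1,      # keep every word in this toy corpus
    sg=1,             # 1 = skip-gram, 0 = CBOW
)

# Look up the learned vector for a word and its nearest neighbors.
vector = model.wv["embeddings"]
print(model.wv.most_similar("embeddings", topn=3))
```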

(Image credit: Dynamic Word Embedding for Evolving Semantic Discovery)

Papers

Showing 1681–1690 of 4002 papers

| Title | Status | Hype |
| --- | --- | --- |
| A Novel Method of Extracting Topological Features from Word Embeddings | | 0 |
| Data-driven models and computational tools for neurolinguistics: a language technology perspective | Code | 0 |
| Temporal Embeddings and Transformer Models for Narrative Text Understanding | | 0 |
| Distant Supervision and Noisy Label Learning for Low Resource Named Entity Recognition: A Study on Hausa and Yorùbá | | 0 |
| A Machine Learning Application for Raising WASH Awareness in the Times of COVID-19 Pandemic | | 0 |
| Leveraging Foreign Language Labeled Data for Aspect-Based Opinion Mining | | 0 |
| Text Similarity Using Word Embeddings to Classify Misinformation | | 0 |
| Word Sense Disambiguation for 158 Languages using Word Embeddings Only | | 0 |
| Using word embeddings to improve the discriminability of co-occurrence text networks | | 0 |
| A Precisely Xtreme-Multi Channel Hybrid Approach For Roman Urdu Sentiment Analysis | | 0 |

No leaderboard results yet.