
Word Embeddings

Word embedding is the collective name for a set of language modeling and feature learning techniques in natural language processing (NLP) where words or phrases from the vocabulary are mapped to vectors of real numbers.
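To make the mapping concrete, the minimal sketch below (an illustration only; the toy four-dimensional vectors are invented for the example, not taken from any trained model) shows an embedding as a lookup table from words to dense vectors, with cosine similarity as the usual way to compare them:

import numpy as np

# Toy embedding table: each vocabulary word maps to a dense real-valued vector.
# These 4-dimensional vectors are invented for illustration; learned embeddings
# typically have 50-1000 dimensions.
embeddings = {
    "king":  np.array([0.8, 0.3, 0.1, 0.9]),
    "queen": np.array([0.7, 0.4, 0.2, 0.9]),
    "apple": np.array([0.1, 0.9, 0.8, 0.2]),
}

def cosine_similarity(u, v):
    """Cosine of the angle between two vectors; values near 1.0 mean similar."""
    return np.dot(u, v) / (np.linalg.norm(u) * np.linalg.norm(v))

# Semantically related words end up closer together in the vector space.
print(cosine_similarity(embeddings["king"], embeddings["queen"]))  # high
print(cosine_similarity(embeddings["king"], embeddings["apple"]))  # low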

Techniques for learning word embeddings include Word2Vec and GloVe, as well as neural network-based approaches that train on an NLP task such as language modeling or document classification. A minimal training sketch follows below.
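As an illustration of the training step, here is a minimal sketch using the gensim library (assuming gensim 4.x; the toy corpus and parameter values are placeholders, not recommendations from any of the papers listed below) that fits a skip-gram Word2Vec model and queries the resulting vectors:

from gensim.models import Word2Vec

# Toy corpus: each sentence is a list of tokens. A real corpus would be
# millions of sentences; this one exists only to make the script runnable.
sentences = [
    ["the", "cat", "sat", "on", "the", "mat"],
    ["the", "dog", "sat", "on", "the", "rug"],
    ["cats", "and", "dogs", "are", "animals"],
]

# Skip-gram Word2Vec: learn vectors by predicting context words from a target word.
model = Word2Vec(
    sentences,
    vector_size=50,  # embedding dimension; 100-300 is common in practice
    window=2,        # context window size
    min_count=1,     # keep every word, since the corpus is tiny
    sg=1,            # 1 = skip-gram, 0 = CBOW
    seed=42,
)

# The learned embedding for a word, and its nearest neighbours in the space.
vec = model.wv["cat"]                    # a 50-dimensional numpy array
print(model.wv.most_similar("cat", topn=3))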

(Image credit: Dynamic Word Embedding for Evolving Semantic Discovery)

Papers

Showing 3291–3300 of 4002 papers

Title | Hype
Bootstrapping Polar-Opposite Emotion Dimensions from Online Reviews | 0
Borrow a Little from your Rich Cousin: Using Embeddings and Polarities of English Words for Multilingual Sentiment Classification | 0
bot.zen @ EmpiriST 2015 - A minimally-deep learning PoS-tagger (trained for German CMC and Web data) | 0
BOUN-ISIK Participation: An Unsupervised Approach for the Named Entity Normalization and Relation Extraction of Bacteria Biotopes | 0
Brazilian Lyrics-Based Music Genre Classification Using a BLSTM Network | 0
Breaking Down Word Semantics from Pre-trained Language Models through Layer-wise Dimension Selection | 0
Bridging the Defined and the Defining: Exploiting Implicit Lexical Semantic Relations in Definition Modeling | 0
Bridging the Gap: Incorporating a Semantic Similarity Measure for Effectively Mapping PubMed Queries to Documents | 0
Bridging the Modality Gap: Enhancing Channel Prediction with Semantically Aligned LLMs and Knowledge Distillation | 0
Bringing Order to Neural Word Embeddings with Embeddings Augmented by Random Permutations (EARP) | 0

Leaderboard

No leaderboard results yet.