
Word Embeddings

Word embedding is the collective name for a set of language modeling and feature learning techniques in natural language processing (NLP) where words or phrases from the vocabulary are mapped to vectors of real numbers.
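To make the mapping concrete, here is a minimal sketch of an embedding lookup, assuming a hypothetical toy vocabulary with randomly initialized vectors (the vocabulary, dimensionality, and values below are illustrative, not from any trained model):

```python
import numpy as np

# Hypothetical toy vocabulary; real vocabularies hold tens of thousands of words.
vocab = ["king", "queen", "apple", "orange"]
word_to_index = {word: i for i, word in enumerate(vocab)}

# Embedding matrix: one 8-dimensional real-valued vector per word.
# Randomly initialized here; training would move these vectors so that
# semantically similar words end up close together.
rng = np.random.default_rng(seed=0)
embeddings = rng.normal(size=(len(vocab), 8))

def embed(word: str) -> np.ndarray:
    """Map a word to its vector of real numbers via a table lookup."""
    return embeddings[word_to_index[word]]

def cosine_similarity(u: np.ndarray, v: np.ndarray) -> float:
    """Standard similarity measure between two embedding vectors."""
    return float(u @ v / (np.linalg.norm(u) * np.linalg.norm(v)))

print(embed("king"))                                     # an 8-dim real vector
print(cosine_similarity(embed("king"), embed("queen")))  # near 0 for random vectors
```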

Techniques for learning word embeddings include Word2Vec, GloVe, and other neural network-based approaches that train on an NLP task such as language modeling or document classification.
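As an example of one such technique, below is a hedged sketch of training a skip-gram Word2Vec model with the gensim library (gensim 4.x assumed; the toy corpus and parameter values are illustrative assumptions, not a prescribed configuration):

```python
from gensim.models import Word2Vec

# Illustrative toy corpus; real training uses millions of tokenized sentences.
sentences = [
    ["the", "king", "rules", "the", "kingdom"],
    ["the", "queen", "rules", "the", "kingdom"],
    ["people", "eat", "apples", "and", "oranges"],
]

# Skip-gram Word2Vec; vector_size/window/min_count are assumed toy values.
model = Word2Vec(
    sentences,
    vector_size=50,  # dimensionality of the learned vectors
    window=3,        # context window around each target word
    min_count=1,     # keep every word, since this corpus is tiny
    sg=1,            # 1 = skip-gram, 0 = CBOW
)

vector = model.wv["king"]             # the learned 50-dim vector for "king"
print(model.wv.most_similar("king"))  # nearest neighbors by cosine similarity
```

With a realistic corpus, `most_similar` surfaces semantically related words; on a three-sentence toy corpus the neighbors are essentially noise.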

(Image credit: Dynamic Word Embedding for Evolving Semantic Discovery)

Papers

Showing 631–640 of 4002 papers (page 64 of 401)

Title | Hype
Brazilian Lyrics-Based Music Genre Classification Using a BLSTM Network | 0
Breaking Down Word Semantics from Pre-trained Language Models through Layer-wise Dimension Selection | 0
"A Passage to India": Pre-trained Word Embeddings for Indian Languages | 0
A Cross-lingual Natural Language Processing Framework for Infodemic Management | 0
A Hybrid Learning Scheme for Chinese Word Embedding | 0
Bridging the Defined and the Defining: Exploiting Implicit Lexical Semantic Relations in Definition Modeling | 0
ClaiRE at SemEval-2018 Task 7: Classification of Relations using Embeddings | 0
Bridging the Modality Gap: Enhancing Channel Prediction with Semantically Aligned LLMs and Knowledge Distillation | 0
A Platform Agnostic Dual-Strand Hate Speech Detector | 0
Classifying Out-of-vocabulary Terms in a Domain-Specific Social Media Corpus | 0

No leaderboard results yet.