
Word Embeddings

Word embedding is the collective name for a set of language modeling and feature learning techniques in natural language processing (NLP) where words or phrases from the vocabulary are mapped to vectors of real numbers.
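To make the mapping concrete, the minimal sketch below looks up pretrained word vectors and compares two of them. It uses gensim's downloader and the pretrained model name "glove-wiki-gigaword-50" purely as illustrative choices; neither is specified by this page, and the data is downloaded on first use.

```python
import gensim.downloader as api  # assumes gensim with the gensim-data downloader installed

# Load 50-dimensional pretrained GloVe vectors (downloads ~66 MB on first use).
glove = api.load("glove-wiki-gigaword-50")

# Each word maps to a vector of real numbers.
print(glove["king"].shape)                 # (50,)
print(glove["king"][:5])                   # first five components of the vector

# Geometric closeness in the vector space reflects semantic similarity.
print(glove.similarity("king", "queen"))   # cosine similarity, close to 1 for related words
print(glove.most_similar("king", topn=3))  # nearest neighbours by cosine similarity
```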

Techniques for learning word embeddings include Word2Vec, GloVe, and other neural network-based approaches that train on an NLP task such as language modeling or document classification.
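As a rough sketch of how one such technique is trained, the snippet below fits a skip-gram Word2Vec model with gensim (assuming gensim >= 4.0, where the dimensionality parameter is `vector_size`). The toy corpus and hyperparameters are illustrative only, not a recipe from this page.

```python
from gensim.models import Word2Vec  # assumes gensim >= 4.0

# Toy tokenized corpus; real embeddings are trained on far larger corpora.
corpus = [
    ["the", "cat", "sat", "on", "the", "mat"],
    ["the", "dog", "sat", "on", "the", "rug"],
    ["cats", "and", "dogs", "are", "pets"],
]

# sg=1 selects the skip-gram objective (sg=0 would use CBOW);
# workers=1 keeps training deterministic together with a fixed seed.
model = Word2Vec(
    sentences=corpus, vector_size=50, window=2,
    min_count=1, sg=1, seed=0, workers=1,
)

vec = model.wv["cat"]                      # the learned 50-dimensional vector for "cat"
print(vec.shape)                           # (50,)
print(model.wv.similarity("cat", "dog"))   # cosine similarity between the two word vectors
```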

(Image credit: Dynamic Word Embedding for Evolving Semantic Discovery)

Papers

Showing 651-660 of 4,002 papers

- Addressing the Challenges of Cross-Lingual Hate Speech Detection
- Compressing Word Embeddings Using Syllables
- Diagnosing BERT with Retrieval Heuristics (Code)
- D-Graph: AI-Assisted Design Concept Exploration Graph
- Embeddings Evaluation Using a Novel Measure of Semantic Similarity (Code)
- HuSpaCy: an industrial-strength Hungarian natural language processing toolkit (Code)
- Applying Word Embeddings to Measure Valence in Information Operations Targeting Journalists in Brazil
- Semi-automatic WordNet Linking using Word Embeddings
- Predicting Influenza A Viral Host Using PSSM and Word Embeddings
- Which Student is Best? A Comprehensive Knowledge Distillation Exam for Task-Specific BERT Models

Leaderboards

No leaderboard results yet.