
Word Embeddings

Word embedding is the collective name for a set of language modeling and feature learning techniques in natural language processing (NLP) where words or phrases from the vocabulary are mapped to vectors of real numbers.

Techniques for learning word embeddings include Word2Vec, GloVe, and other neural-network approaches that train on an NLP task such as language modeling or document classification; a minimal training sketch follows below.

(Image credit: Dynamic Word Embedding for Evolving Semantic Discovery)
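As a minimal sketch of how such embeddings can be trained in practice, the snippet below fits a skip-gram Word2Vec model, assuming gensim ≥ 4.0 is available; the toy corpus and parameter values are illustrative only and are not taken from the source.

```python
# Minimal sketch: training skip-gram word embeddings with gensim's Word2Vec.
# Assumes gensim >= 4.0 (where the old `size` parameter became `vector_size`).
# The corpus here is a toy example; real training uses a large tokenized corpus.
from gensim.models import Word2Vec

corpus = [
    ["word", "embeddings", "map", "words", "to", "vectors"],
    ["similar", "words", "get", "similar", "vectors"],
    ["glove", "and", "word2vec", "learn", "embeddings", "from", "text"],
]

model = Word2Vec(
    sentences=corpus,
    vector_size=100,  # dimensionality of the real-valued embedding vectors
    window=5,         # context window size around each target word
    min_count=1,      # keep every token in this tiny corpus
    sg=1,             # 1 = skip-gram, 0 = CBOW
)

vec = model.wv["embeddings"]                   # a 100-dimensional vector
print(model.wv.most_similar("words", topn=3))  # nearest neighbors by cosine similarity
```

Once trained, each vocabulary word maps to a dense vector, and semantic similarity between words can be measured as cosine similarity between their vectors.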

Papers

Showing 1201–1210 of 4002 papers

Title | Status | Hype
Enhanced word embeddings using multi-semantic representation through lexical chains | Code | 0
Censorship of Online Encyclopedias: Implications for NLP Models | — | 0
BERT Transformer model for Detecting Arabic GPT2 Auto-Generated Tweets | — | 0
Evaluating Multilingual Text Encoders for Unsupervised Cross-Lingual Retrieval | Code | 0
Multi-sense embeddings through a word sense disambiguation process | Code | 0
Hostility Detection and Covid-19 Fake News Detection in Social Media | — | 0
Experimental Evaluation of Deep Learning models for Marathi Text Classification | — | 0
Evaluation of Deep Learning Models for Hostility Detection in Hindi Text | — | 0
Clustering Word Embeddings with Self-Organizing Maps. Application on LaRoSeDa -- A Large Romanian Sentiment Data Set | Code | 0
Eating Garlic Prevents COVID-19 Infection: Detecting Misinformation on the Arabic Content of Twitter | Code | 0

No leaderboard results yet.