Word Embeddings

Word embedding is the collective name for a set of language modeling and feature learning techniques in natural language processing (NLP) where words or phrases from the vocabulary are mapped to vectors of real numbers.
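
As a minimal sketch of that mapping, consider a few toy, hand-picked 3-dimensional vectors (real embeddings are learned from data and have hundreds of dimensions; these numbers are made up purely for illustration). Cosine similarity over the vectors then gives a notion of semantic closeness:

```python
# Toy illustration of the word -> vector mapping. The vectors below are
# invented for this example, not learned; real models produce them from data.
import numpy as np

embeddings = {
    "king":  np.array([0.9, 0.8, 0.1]),
    "queen": np.array([0.9, 0.7, 0.9]),
    "apple": np.array([0.1, 0.9, 0.4]),
}

def cosine_similarity(u: np.ndarray, v: np.ndarray) -> float:
    """Cosine of the angle between two vectors: 1.0 means same direction."""
    return float(np.dot(u, v) / (np.linalg.norm(u) * np.linalg.norm(v)))

# Semantically related words should end up closer together in vector space.
print(cosine_similarity(embeddings["king"], embeddings["queen"]))  # ~0.83
print(cosine_similarity(embeddings["king"], embeddings["apple"]))  # ~0.71
```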

Techniques for learning word embeddings include Word2Vec, GloVe, and other approaches that train on an NLP task such as language modeling or document classification.
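
For concreteness, here is a short sketch of training one such model with the gensim library's Word2Vec implementation (assuming gensim 4.x; the tiny corpus is a placeholder for real tokenized training data):

```python
# Sketch of learning embeddings with Word2Vec via gensim 4.x
# (assumed installed: pip install gensim). The three-sentence corpus
# below is a stand-in; real training uses millions of sentences.
from gensim.models import Word2Vec

corpus = [
    ["the", "king", "rules", "the", "kingdom"],
    ["the", "queen", "rules", "the", "kingdom"],
    ["i", "ate", "an", "apple", "today"],
]

model = Word2Vec(
    sentences=corpus,
    vector_size=50,   # dimensionality of the learned vectors
    window=2,         # context words considered on each side
    min_count=1,      # keep every word in this toy corpus
    sg=1,             # 1 = skip-gram, 0 = CBOW
    epochs=50,
)

vec = model.wv["king"]                        # learned vector for "king"
print(vec.shape)                              # (50,)
print(model.wv.most_similar("king", topn=3))  # nearest neighbours in the space
```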

(Image credit: Dynamic Word Embedding for Evolving Semantic Discovery)

Papers

Showing 1271–1280 of 4002 papers

Efficient Data Selection for Bilingual Terminology Extraction from Comparable Corpora
"A Passage to India": Pre-trained Word Embeddings for Indian Languages
Efficient Extraction of Pseudo-Parallel Sentences from Raw Monolingual Data Using Word Embeddings
Efficient Non-parametric Estimation of Multiple Embeddings per Word in Vector Space
Determining Gains Acquired from Word Embedding Quantitatively Using Discrete Distribution Clustering
Determining Code Words in Euphemistic Hate Speech Using Word Embedding Networks
Bridging the Modality Gap: Enhancing Channel Prediction with Semantically Aligned LLMs and Knowledge Distillation
Evaluating Word Embeddings in Extremely Under-Resourced Languages: A Case Study in Bribri
EICA Team at SemEval-2018 Task 2: Semantic and Metadata-based Features for Multilingual Emoji Prediction
Beyond Bilingual: Multi-sense Word Embeddings using Multilingual Context
