
Word Embeddings

Word embedding is the collective name for a set of language modeling and feature learning techniques in natural language processing (NLP) where words or phrases from the vocabulary are mapped to vectors of real numbers.

Techniques for learning word embeddings include Word2Vec, GloVe, and other neural-network approaches that train on an NLP task such as language modeling or document classification.
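As a rough illustration of the word-to-vector mapping described above, the sketch below trains a tiny skip-gram Word2Vec model and looks up the learned vectors. It assumes the gensim 4.x API; the toy corpus and hyperparameters are illustrative only and are not taken from any of the papers listed here.

# Minimal Word2Vec sketch (assumes gensim 4.x is installed;
# corpus and hyperparameters are illustrative, not authoritative).
from gensim.models import Word2Vec

# Tokenized toy corpus: each document is a list of word tokens.
corpus = [
    ["word", "embeddings", "map", "words", "to", "vectors"],
    ["word2vec", "learns", "embeddings", "from", "context"],
    ["glove", "learns", "embeddings", "from", "cooccurrence", "counts"],
]

model = Word2Vec(
    sentences=corpus,
    vector_size=50,   # dimensionality of the real-valued vectors
    window=2,         # context window size
    min_count=1,      # keep every word in this tiny corpus
    sg=1,             # 1 = skip-gram, 0 = CBOW
    epochs=200,       # many passes, since the corpus is tiny
    seed=42,
)

vec = model.wv["embeddings"]   # the learned vector for one word
print(vec.shape)               # (50,)
# Nearest neighbors by cosine similarity in the embedding space:
print(model.wv.most_similar("word2vec", topn=3))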

(Image credit: Dynamic Word Embedding for Evolving Semantic Discovery)

Papers

Showing 481–490 of 4,002 papers

Title | Status | Hype
Learnt Contrastive Concept Embeddings for Sign Recognition | - | 0
Lightweight Adaptation of Neural Language Models via Subspace Embedding | Code | 0
A Preliminary Study on a Conceptual Game Feature Generation and Recommendation System | - | 0
Gloss Alignment Using Word Embeddings | - | 0
Vocab-Expander: A System for Creating Domain-Specific Vocabularies Based on Word Embeddings | - | 0
3D-EX: A Unified Dataset of Definitions and Dictionary Examples | Code | 0
Beyond One-Hot-Encoding: Injecting Semantics to Drive Image Classifiers | Code | 0
Lessons in Reproducibility: Insights from NLP Studies in Materials Science | - | 0
The flow of ideas in word embeddings | - | 0
Towards Resolving Word Ambiguity with Word Embeddings | - | 0
Page 49 of 401

No leaderboard results yet.