
Word Embeddings

Word embedding is the collective name for a set of language modeling and feature learning techniques in natural language processing (NLP) where words or phrases from the vocabulary are mapped to vectors of real numbers.
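As a minimal illustration of this mapping, the sketch below assigns a toy vocabulary hand-written 3-dimensional vectors and compares them with cosine similarity; the vector values are invented for illustration, not learned from data.

```python
import numpy as np

# Toy vocabulary mapped to dense real-valued vectors. The values here are
# illustrative only; real embeddings are learned by training on a corpus.
embeddings = {
    "king":  np.array([0.50, 0.68, -0.59]),
    "queen": np.array([0.54, 0.86, -0.47]),
    "apple": np.array([-0.11, 0.23, 0.70]),
}

def cosine(u, v):
    """Cosine similarity, the standard closeness measure for embeddings."""
    return float(u @ v / (np.linalg.norm(u) * np.linalg.norm(v)))

print(cosine(embeddings["king"], embeddings["queen"]))  # high: related words
print(cosine(embeddings["king"], embeddings["apple"]))  # lower: unrelated words
```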

Techniques for learning word embeddings include Word2Vec, GloVe, and other neural network-based approaches that train on an NLP task such as language modeling or document classification. A minimal training sketch follows.
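The snippet below is a hedged sketch of one such technique: it trains a skip-gram Word2Vec model on a toy corpus, assuming the gensim library (version 4 or later). The corpus and hyperparameters are illustrative only, not a recommended configuration.

```python
from gensim.models import Word2Vec

# Tiny toy corpus: each sentence is a list of tokens.
corpus = [
    ["word", "embeddings", "map", "words", "to", "vectors"],
    ["word2vec", "learns", "embeddings", "from", "local", "context"],
    ["glove", "factorizes", "global", "co-occurrence", "statistics"],
]

# Train a skip-gram model (sg=1); min_count=1 keeps every token in this
# tiny corpus. Hyperparameters are placeholders for illustration.
model = Word2Vec(corpus, vector_size=50, window=2, min_count=1, sg=1, epochs=50)

vec = model.wv["embeddings"]          # the learned 50-dimensional vector
print(vec.shape)                      # (50,)
print(model.wv.most_similar("word"))  # nearest neighbours in the toy space
```

On a realistic corpus, the same API is typically run with a much larger vector_size and min_count; nothing meaningful should be read into similarities learned from three sentences.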

(Image credit: Dynamic Word Embedding for Evolving Semantic Discovery)

Papers

Showing 1831-1840 of 4002 papers

Title | Status | Hype
Utilizing Word Embeddings based Features for Phylogenetic Tree Generation of Sanskrit Texts | - | 0
Language-Agnostic Visual-Semantic Embeddings | Code | 0
Specializing Word Embeddings (for Parsing) by Information Bottleneck | Code | 0
Improved Word Sense Disambiguation Using Pre-Trained Contextualized Word Representations | Code | 0
Additional Shared Decoder on Siamese Multi-view Encoders for Learning Acoustic Word Embeddings | - | 0
Bad Form: Comparing Context-Based and Form-Based Few-Shot Learning in Distributional Semantic Models | - | 0
A Pilot Study for Chinese SQL Semantic Parsing | Code | 2
Learning Category Correlations for Multi-label Image Recognition with Graph Networks | - | 0
On the Importance of Subword Information for Morphological Tasks in Truly Low-Resource Languages | - | 0
Learn Interpretable Word Embeddings Efficiently with von Mises-Fisher Distribution | - | 0
Page 184 of 401

No leaderboard results yet.