Word Embeddings

Word embedding is the collective name for a set of language modeling and feature learning techniques in natural language processing (NLP) where words or phrases from the vocabulary are mapped to vectors of real numbers.
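To make the mapping concrete, here is a minimal sketch of a word-to-vector lookup compared with cosine similarity. The vocabulary, the 8-dimensional size, and the random vectors are illustrative assumptions; a real embedding model learns its vectors from a corpus rather than sampling them.

```python
import numpy as np

# Hypothetical toy vocabulary; real systems cover tens of thousands of words.
vocab = ["king", "queen", "man", "woman"]

# Each word maps to a vector of real numbers. These 8-dim vectors are random
# placeholders standing in for the learned parameters of a trained model.
rng = np.random.default_rng(0)
embeddings = {word: rng.normal(size=8) for word in vocab}

def cosine(u, v):
    """Cosine similarity, the usual way to compare embedding vectors."""
    return float(u @ v / (np.linalg.norm(u) * np.linalg.norm(v)))

print(embeddings["king"])                               # the vector for "king"
print(cosine(embeddings["king"], embeddings["queen"]))  # similarity score
```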

Techniques for learning word embeddings include Word2Vec and GloVe, as well as neural network approaches that learn embeddings as a by-product of training on an NLP task such as language modeling or document classification.
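As a minimal sketch of one such technique, the snippet below trains a skip-gram Word2Vec model with gensim (assuming gensim >= 4.0 is installed). The tiny corpus and the hyperparameter values are placeholders; practical models are trained on much larger text collections.

```python
from gensim.models import Word2Vec

# Placeholder corpus: a list of tokenized sentences.
sentences = [
    ["the", "cat", "sat", "on", "the", "mat"],
    ["the", "dog", "chased", "the", "cat"],
    ["dogs", "and", "cats", "are", "common", "pets"],
]

# sg=1 selects the skip-gram objective; vector_size is the embedding dimension.
model = Word2Vec(sentences, vector_size=50, window=2, min_count=1, sg=1, epochs=100)

vector = model.wv["cat"]                     # learned 50-dim vector for "cat"
print(model.wv.most_similar("cat", topn=3))  # nearest neighbors by cosine similarity
```

GloVe, by contrast, learns its vectors by factorizing a global word co-occurrence matrix rather than training a neural predictor, but it produces the same kind of dense real-valued vectors.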

(Image credit: Dynamic Word Embedding for Evolving Semantic Discovery)

Papers

Showing papers 2491–2500 of 4002 (page 250 of 401)

Title | Status | Hype
Traffic event description based on Twitter data using Unsupervised Learning Methods for Indian road conditions | | 0
Training and Evaluating Multimodal Word Embeddings with Large-scale Web Annotated Images | | 0
Training Word Sense Embeddings With Lexicon-based Regularization | | 0
Transcending the "Male Code": Implicit Masculine Biases in NLP Contexts | | 0
TransDrift: Modeling Word-Embedding Drift using Transformer | | 0
Transfer and Multi-Task Learning for Noun-Noun Compound Interpretation | | 0
Transfer Learning across Low-Resource, Related Languages for Neural Machine Translation | | 0
Transfer Learning in Natural Language Processing | | 0
Transferred Embeddings for Igbo Similarity, Analogy, and Diacritic Restoration Tasks | | 0
Transferring Coreference Resolvers with Posterior Regularization | | 0

Leaderboards

No leaderboard results yet.