
Word Embeddings

Word embedding is the collective name for a set of language modeling and feature learning techniques in natural language processing (NLP) where words or phrases from the vocabulary are mapped to vectors of real numbers.
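In concrete terms, a learned embedding is just a lookup table from each vocabulary word to a dense real-valued vector, and relatedness between words is measured geometrically, most often by cosine similarity. A minimal sketch in Python (the three-dimensional vectors and tiny vocabulary below are made up for illustration; real embeddings are learned and typically have 50 to 300 dimensions):

```python
import numpy as np

# Toy embedding table: each vocabulary word maps to a dense vector of
# real numbers. These values are invented for illustration only.
embeddings = {
    "king":  np.array([0.80, 0.65, 0.10]),
    "queen": np.array([0.78, 0.70, 0.12]),
    "apple": np.array([0.05, 0.10, 0.90]),
}

def cosine_similarity(u, v):
    """Cosine of the angle between two embedding vectors."""
    return float(np.dot(u, v) / (np.linalg.norm(u) * np.linalg.norm(v)))

# Semantically related words end up with higher cosine similarity.
print(cosine_similarity(embeddings["king"], embeddings["queen"]))  # high
print(cosine_similarity(embeddings["king"], embeddings["apple"]))  # low
```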

Techniques for learning word embeddings include Word2Vec, GloVe, and other approaches that train on an auxiliary NLP task such as language modeling or document classification.
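As a sketch of how such embeddings are trained in practice, the snippet below fits a skip-gram Word2Vec model with the gensim library (assuming gensim 4.x; the toy corpus is far too small to produce meaningful vectors and is included only to show the API):

```python
from gensim.models import Word2Vec

# Toy tokenized corpus; a real run would use millions of sentences.
corpus = [
    ["the", "king", "rules", "the", "kingdom"],
    ["the", "queen", "rules", "the", "kingdom"],
    ["the", "farmer", "grows", "an", "apple"],
]

# sg=1 selects the skip-gram objective; vector_size sets the embedding
# dimensionality and window the context size around each target word.
model = Word2Vec(corpus, vector_size=50, window=2, min_count=1, sg=1, epochs=50)

vec = model.wv["king"]                        # learned 50-d vector for "king"
print(model.wv.most_similar("king", topn=2))  # nearest neighbors by cosine
```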

(Image credit: Dynamic Word Embedding for Evolving Semantic Discovery)

Papers

Showing 1461–1470 of 4002 papers

Title | Code available | Hype
Hybrid Improved Document-level Embedding (HIDE) | no | 0
BERT-based Ensembles for Modeling Disclosure and Support in Conversational Social Media Text | no | 0
Data Augmentation with Unsupervised Machine Translation Improves the Structural Similarity of Cross-lingual Word Embeddings | no | 0
InfiniteWalk: Deep Network Embeddings as Laplacian Embeddings with a Nonlinearity | yes | 0
Quasi-orthonormal Encoding for Machine Learning Applications | yes | 0
Transition-based Semantic Dependency Parsing with Pointer Networks | yes | 1
TIME: Text and Image Mutual-Translation Adversarial Networks | no | 0
A Study of Neural Matching Models for Cross-lingual IR | no | 0
AutoSUM: Automating Feature Extraction and Multi-user Preference Simulation for Entity Summarization | yes | 0
Degree-Aware Alignment for Entities in Tail | yes | 0
