
Word Embeddings

Word embedding is the collective name for a set of language modeling and feature learning techniques in natural language processing (NLP) where words or phrases from the vocabulary are mapped to vectors of real numbers.
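
As a minimal illustration of the idea, the sketch below maps a few words to real-valued vectors and compares them; the toy vocabulary, the 3-dimensional vectors, and the cosine_similarity helper are all hypothetical, not from any trained model:

```python
import numpy as np

# Hypothetical 3-dimensional embeddings for a toy vocabulary;
# real models typically use vectors with 100-1000+ dimensions learned from data.
embeddings = {
    "king":  np.array([0.8, 0.3, 0.1]),
    "queen": np.array([0.7, 0.4, 0.1]),
    "apple": np.array([0.1, 0.2, 0.9]),
}

def cosine_similarity(a, b):
    # Standard cosine similarity between two vectors.
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))

# Semantically related words end up closer in the vector space.
print(cosine_similarity(embeddings["king"], embeddings["queen"]))  # high
print(cosine_similarity(embeddings["king"], embeddings["apple"]))  # low
```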

Techniques for learning word embeddings include Word2Vec, GloVe, and other neural network-based approaches that train on an NLP task such as language modeling or document classification, as in the training sketch below.
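
A minimal Word2Vec training sketch, assuming gensim 4.x is installed; the toy corpus and all hyperparameter values are illustrative, not a recommended configuration:

```python
from gensim.models import Word2Vec

# Toy corpus: a list of tokenized sentences (illustrative only).
corpus = [
    ["word", "embeddings", "map", "words", "to", "vectors"],
    ["word2vec", "learns", "embeddings", "from", "context"],
    ["glove", "uses", "global", "cooccurrence", "statistics"],
]

# sg=1 selects the skip-gram objective; vector_size is the embedding dimension.
model = Word2Vec(corpus, vector_size=50, window=3, min_count=1, sg=1, epochs=50)

# Look up the learned vector for a word and find its nearest neighbors.
vec = model.wv["embeddings"]          # a 50-dimensional numpy array
print(model.wv.most_similar("embeddings", topn=3))
```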

(Image credit: Dynamic Word Embedding for Evolving Semantic Discovery)

Papers

Showing 3221–3230 of 4002 papers

Title | Status | Hype
Multi-Task Text Classification using Graph Convolutional Networks for Large-Scale Low Resource Language | Code | 0
Unsupervised Domain Adaptation of Contextualized Embeddings for Sequence Labeling | Code | 0
ReviewViz: Assisting Developers Perform Empirical Study on Energy Consumption Related Reviews for Mobile Applications | Code | 0
Deep Pivot-Based Modeling for Cross-language Cross-domain Transfer with Minimal Guidance | Code | 0
Constructing Colloquial Dataset for Persian Sentiment Analysis of Social Microblogs | Code | 0
Intelligent Word Embeddings of Free-Text Radiology Reports | Code | 0
Concatenated Power Mean Word Embeddings as Universal Cross-Lingual Sentence Representations | Code | 0
Interactive Refinement of Cross-Lingual Word Embeddings | Code | 0
Compressing Word Embeddings via Deep Compositional Code Learning | Code | 0
Learning Embeddings into Entropic Wasserstein Spaces | Code | 0

Leaderboard

No leaderboard results yet.