
Word Embeddings

Word embedding is the collective name for a set of language modeling and feature learning techniques in natural language processing (NLP) where words or phrases from the vocabulary are mapped to vectors of real numbers.
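This mapping can be pictured as a lookup table from words to vectors. Below is a minimal, self-contained sketch with made-up 3-dimensional vectors (real embeddings are learned from large corpora and are typically 50 to 300 dimensions); the words and values are illustrative only. Similarity between words is commonly measured with cosine similarity between their vectors.

```python
import numpy as np

# Toy lookup table: four words mapped to 3-dimensional real-valued
# vectors. The values here are invented for illustration; in practice
# they are learned from text corpora.
embeddings = {
    "king":  np.array([0.8, 0.6, 0.1]),
    "queen": np.array([0.7, 0.7, 0.1]),
    "apple": np.array([0.1, 0.2, 0.9]),
    "pear":  np.array([0.1, 0.3, 0.8]),
}

def cosine_similarity(u, v):
    """Cosine of the angle between two vectors; 1.0 = same direction."""
    return float(np.dot(u, v) / (np.linalg.norm(u) * np.linalg.norm(v)))

# Semantically related words should end up with more similar vectors.
print(cosine_similarity(embeddings["king"], embeddings["queen"]))
print(cosine_similarity(embeddings["king"], embeddings["apple"]))
```

With these toy vectors, "king" and "queen" score much closer to 1.0 than "king" and "apple", which is the property learned embeddings are trained to exhibit.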

Techniques for learning word embeddings include Word2Vec, GloVe, and other approaches that train on an NLP task such as language modeling or document classification.
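To make the training idea concrete, here is a minimal skip-gram sketch in plain NumPy: each centre word is trained to predict its context words via a softmax, and the rows of the input matrix become the word vectors. This is a simplified illustration, not a faithful Word2Vec implementation (real systems use negative sampling or hierarchical softmax, subsampling, and far larger corpora); the toy corpus and hyperparameters are assumptions for demonstration.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy corpus; real Word2Vec-style training uses millions of tokens.
corpus = "the cat sat on the mat the dog sat on the rug".split()
vocab = sorted(set(corpus))
idx = {w: i for i, w in enumerate(vocab)}
V, D = len(vocab), 8            # vocabulary size, embedding dimension

W_in = rng.normal(0, 0.1, (V, D))   # input (word) embeddings
W_out = rng.normal(0, 0.1, (V, D))  # output (context) embeddings

def softmax(z):
    e = np.exp(z - z.max())
    return e / e.sum()

# Skip-gram objective: predict each context word from the centre word.
lr, window = 0.1, 2
for epoch in range(100):
    for pos, word in enumerate(corpus):
        for j in range(max(0, pos - window), min(len(corpus), pos + window + 1)):
            if j == pos:
                continue
            c, o = idx[word], idx[corpus[j]]
            p = softmax(W_out @ W_in[c])  # P(context word | centre word)
            grad = p.copy()
            grad[o] -= 1.0                # gradient of cross-entropy w.r.t. logits
            g_in = W_out.T @ grad         # compute before updating W_out
            W_out -= lr * np.outer(grad, W_in[c])
            W_in[c] -= lr * g_in

# After training, the rows of W_in are the learned word vectors.
print(W_in[idx["cat"]])
```

The same lookup-and-similarity machinery from the definition above then applies to the learned rows of `W_in`.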

(Image credit: Dynamic Word Embedding for Evolving Semantic Discovery)

Papers

Showing 2941-2950 of 4002 papers

Title | Hype
Multilingual segmentation based on neural networks and pre-trained word embeddings | 0
Multilingual Sentiment Analysis: An RNN-Based Framework for Limited Data | 0
Multilingual Seq2seq Training with Similarity Loss for Cross-Lingual Document Classification | 0
Multilingual Training of Crosslingual Word Embeddings | 0
Multilingual Visual Sentiment Concept Matching | 0
Multilingual Word Embeddings for Low-Resource Languages using Anchors and a Chain of Related Languages | 0
Multilingual Word Embeddings using Multigraphs | 0
Multilingual Wordnet sense Ranking using nearest context | 0
Multimedia Lab @ ACL WNUT NER Shared Task: Named Entity Recognition for Twitter Microposts using Distributed Word Representations | 0
Multi-Modal Cognitive Maps based on Neural Networks trained on Successor Representations | 0
Page 295 of 401

No leaderboard results yet.