
Word Embeddings

Word embedding is the collective name for a set of language modeling and feature learning techniques in natural language processing (NLP) where words or phrases from the vocabulary are mapped to vectors of real numbers.

Techniques for learning word embeddings include neural models such as Word2Vec, count-based methods such as GloVe, and other approaches that train on an NLP task such as language modeling or document classification.
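As a hedged illustration of the idea, the sketch below trains skip-gram Word2Vec vectors on a toy corpus using the gensim library; the corpus, hyperparameters, and the choice of gensim are illustrative assumptions, not the method of any particular paper listed here.

```python
# Minimal sketch: learning word embeddings with skip-gram Word2Vec (gensim).
# The toy corpus and hyperparameters below are illustrative assumptions.
from gensim.models import Word2Vec

# A tiny tokenized corpus; real training uses millions of sentences.
sentences = [
    ["the", "king", "rules", "the", "kingdom"],
    ["the", "queen", "rules", "the", "kingdom"],
    ["embeddings", "map", "words", "to", "vectors"],
]

model = Word2Vec(
    sentences,
    vector_size=50,  # dimensionality of the learned word vectors
    window=2,        # context window size
    min_count=1,     # keep every word in this tiny corpus
    sg=1,            # 1 = skip-gram, 0 = CBOW
    epochs=50,
)

vec = model.wv["king"]                        # a 50-dim vector of real numbers
print(model.wv.most_similar("king", topn=2))  # nearest neighbors by cosine similarity
```

On a realistic corpus, semantically related words end up close in the vector space, which is what makes such embeddings useful as features for downstream NLP tasks.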

(Image credit: Dynamic Word Embedding for Evolving Semantic Discovery)

Papers

Showing 3281–3290 of 4002 papers

| Title | Status | Hype |
| --- | --- | --- |
| Character Composition Model with Convolutional Neural Networks for Dependency Parsing on Morphologically Rich Languages | Code | 0 |
| The Importance of Automatic Syntactic Features in Vietnamese Named Entity Recognition | | 0 |
| ASR error management for improving spoken language understanding | | 0 |
| Second-Order Word Embeddings from Nearest Neighbor Topological Features | Code | 0 |
| Contextualizing Citations for Scientific Summarization using Word Embeddings and Domain Knowledge | | 0 |
| Lightweight Efficient Multi-keyword Ranked Search over Encrypted Cloud Data using Dual Word Embeddings | | 0 |
| Learning Semantic Relatedness From Human Feedback Using Metric Learning | | 0 |
| Mixed Membership Word Embeddings for Computational Social Science | | 0 |
| Utility of General and Specific Word Embeddings for Classifying Translational Stages of Research | | 0 |
| Evaluating vector-space models of analogy | | 0 |
Page 329 of 401

No leaderboard results yet.