
Word Embeddings

Word embedding is the collective name for a set of language modeling and feature learning techniques in natural language processing (NLP) where words or phrases from the vocabulary are mapped to vectors of real numbers.

Techniques for learning word embeddings include Word2Vec, GloVe, and other neural network-based approaches that train on an NLP task such as language modeling or document classification.
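
As a concrete illustration of the Word2Vec approach mentioned above, the minimal sketch below trains skip-gram embeddings with the gensim library (assuming gensim 4.x; the toy corpus and hyperparameters are illustrative only, not drawn from this page):

```python
# Minimal sketch: learning word embeddings with gensim's Word2Vec (gensim 4.x API).
# The toy corpus and all hyperparameters here are illustrative assumptions.
from gensim.models import Word2Vec

corpus = [
    ["word", "embeddings", "map", "words", "to", "vectors"],
    ["similar", "words", "get", "similar", "vectors"],
    ["embeddings", "are", "learned", "from", "cooccurrence", "statistics"],
]

# sg=1 selects the skip-gram objective; vector_size is the embedding dimensionality.
model = Word2Vec(corpus, vector_size=50, window=3, min_count=1, sg=1, epochs=50)

vec = model.wv["words"]                        # a 50-dimensional numpy vector
print(model.wv.most_similar("words", topn=3))  # nearest neighbors by cosine similarity
```

Each vocabulary word is mapped to a dense real-valued vector, so words that appear in similar contexts end up close together in the embedding space.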

(Image credit: Dynamic Word Embedding for Evolving Semantic Discovery)

Papers

Showing papers 2641–2650 of 4002 (page 265 of 401)

Using Word Embeddings to Analyze Protests News
Using Word Embeddings to Analyze Teacher Evaluations: An Application to a Filipino Education Non-Profit Organization
Using Word Embeddings to Explore the Learned Representations of Convolutional Neural Networks
Using word embeddings to improve the discriminability of co-occurrence text networks
Using Word Embeddings to Quantify Ethnic Stereotypes in 12 years of Spanish News
Using Word Embeddings to Translate Named Entities
Using Word Embeddings to Uncover Discourses
UTA DLNLP at SemEval-2016 Task 1: Semantic Textual Similarity: A Unified Framework for Semantic Processing and Evaluation
Utility of General and Specific Word Embeddings for Classifying Translational Stages of Research
Utilizing Character and Word Embeddings for Text Normalization with Sequence-to-Sequence Models
