
Word Embeddings

Word embedding is the collective name for a set of language modeling and feature learning techniques in natural language processing (NLP) where words or phrases from the vocabulary are mapped to vectors of real numbers.

Techniques for learning word embeddings include Word2Vec, GloVe, and other neural-network-based approaches that train on an NLP task such as language modeling or document classification.
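The core idea above, that words appearing in similar contexts should receive similar vectors, can be illustrated without any neural training. The sketch below builds count-based embeddings from a co-occurrence matrix (the starting point for methods such as GloVe) and compares words by cosine similarity; the toy corpus and window size are illustrative choices, not part of any particular method.

```python
from collections import Counter
import math

# Toy corpus (illustrative only).
corpus = [
    "the cat sat on the mat",
    "the dog sat on the rug",
    "a cat chased a dog",
]

tokens = [s.split() for s in corpus]
vocab = sorted({w for sent in tokens for w in sent})

# Count co-occurrences within a +/-2-word window.
window = 2
cooc = {w: Counter() for w in vocab}
for sent in tokens:
    for i, w in enumerate(sent):
        for j in range(max(0, i - window), min(len(sent), i + window + 1)):
            if j != i:
                cooc[w][sent[j]] += 1

def vector(word):
    # A word's embedding is its row of co-occurrence counts over the vocabulary.
    return [cooc[word][w] for w in vocab]

def cosine(u, v):
    dot = sum(a * b for a, b in zip(u, v))
    norm_u = math.sqrt(sum(a * a for a in u))
    norm_v = math.sqrt(sum(b * b for b in v))
    return dot / (norm_u * norm_v)

# "cat" and "dog" share contexts (sat, on, chased), so their
# vectors are closer than "cat" and "mat".
print(cosine(vector("cat"), vector("dog")))
print(cosine(vector("cat"), vector("mat")))
```

Neural approaches like Word2Vec replace these raw counts with dense, low-dimensional vectors learned by predicting context words, but the similarity structure they capture is the same kind shown here.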

(Image credit: Dynamic Word Embedding for Evolving Semantic Discovery)

Papers

Showing 2626–2650 of 4002 papers

Using pseudo-senses for improving the extraction of synonyms from word embeddings
Using reading behavior to predict grammatical functions
Using Sentences as Semantic Representations in Large Scale Zero-Shot Learning
Using time series and natural language processing to identify viral moments in the 2016 U.S. Presidential Debate
Using virtual edges to extract keywords from texts modeled as complex networks
Using Word Embedding for Cross-Language Plagiarism Detection
Using Word Embeddings for Automatic Query Expansion
Using Word Embeddings for Bilingual Unsupervised WSD
Using Word Embeddings for Improving Statistical Machine Translation of Phrasal Verbs
Using Word Embeddings for Italian Crime News Categorization
Using Word Embeddings for Query Translation for Hindi to English Cross Language Information Retrieval
Using Word Embeddings for Unsupervised Acronym Disambiguation
Using Word Embeddings for Visual Data Exploration with Ontodia and Wikidata
Using Word Embeddings in Twitter Election Classification
Using Word Embeddings to Analyze Protests News
Using Word Embeddings to Analyze Teacher Evaluations: An Application to a Filipino Education Non-Profit Organization
Using Word Embeddings to Explore the Learned Representations of Convolutional Neural Networks
Using word embeddings to improve the discriminability of co-occurrence text networks
Using Word Embeddings to Quantify Ethnic Stereotypes in 12 years of Spanish News
Using Word Embeddings to Translate Named Entities
Using Word Embeddings to Uncover Discourses
UTA DLNLP at SemEval-2016 Task 1: Semantic Textual Similarity: A Unified Framework for Semantic Processing and Evaluation
Utility of General and Specific Word Embeddings for Classifying Translational Stages of Research
Utilizing Character and Word Embeddings for Text Normalization with Sequence-to-Sequence Models
Page 106 of 161

No leaderboard results yet.