
Word Embeddings

Word embedding is the collective name for a set of language modeling and feature learning techniques in natural language processing (NLP) where words or phrases from the vocabulary are mapped to vectors of real numbers.

Techniques for learning word embeddings include Word2Vec, GloVe, and other neural network-based approaches that train on an NLP task such as language modeling or document classification.
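As a concrete illustration of mapping words to real-valued vectors, here is a minimal NumPy sketch of count-based embeddings: build a word-word co-occurrence matrix and factor it with a truncated SVD (the toy corpus, window size, and dimensionality below are illustrative assumptions, not from this page or any specific paper).

```python
import numpy as np

# Toy corpus; real embeddings are trained on millions of sentences.
corpus = [
    "the cat sat on the mat".split(),
    "the dog sat on the rug".split(),
    "a cat and a dog played".split(),
]
vocab = sorted({w for sent in corpus for w in sent})
idx = {w: i for i, w in enumerate(vocab)}

# Count co-occurrences within a symmetric context window of 2 words.
window = 2
counts = np.zeros((len(vocab), len(vocab)))
for sent in corpus:
    for i, w in enumerate(sent):
        lo, hi = max(0, i - window), min(len(sent), i + window + 1)
        for j in range(lo, hi):
            if j != i:
                counts[idx[w], idx[sent[j]]] += 1

# Truncated SVD of the log-smoothed count matrix yields dense vectors:
# each vocabulary word is now a point in a low-dimensional real space.
u, s, _ = np.linalg.svd(np.log1p(counts))
dim = 4
embeddings = u[:, :dim] * s[:dim]

def cosine(a, b):
    """Cosine similarity between two embedding vectors."""
    return float(a @ b / (np.linalg.norm(a) * np.linalg.norm(b)))

# Words sharing contexts ("cat"/"dog" both follow "the"/"a" and
# precede "sat") end up with similar vectors.
print(cosine(embeddings[idx["cat"]], embeddings[idx["dog"]]))
```

Prediction-based methods such as Word2Vec learn the vectors with a neural objective instead of an explicit factorization, but the end product is the same: one real-valued vector per vocabulary word.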

(Image credit: Dynamic Word Embedding for Evolving Semantic Discovery)

Papers

Showing 581–590 of 4,002 papers

Can AI Generate Love Advice?: Toward Neural Answer Generation for Non-Factoid Questions
Can Existing Methods Debias Languages Other than English? First Attempt to Analyze and Mitigate Japanese Word Embeddings
A Preliminary Study on a Conceptual Game Feature Generation and Recommendation System
A Precisely Xtreme-Multi Channel Hybrid Approach For Roman Urdu Sentiment Analysis
Aligning Opinions: Cross-Lingual Opinion Mining with Dependencies
Aligning Open IE Relations and KB Relations using a Siamese Network Based on Word Embedding
Apprentissage de plongements lexicaux par une approche réseaux complexes (Complex networks based word embeddings)
Active Discriminative Text Representation Learning
Text2Node: a Cross-Domain System for Mapping Arbitrary Phrases to a Taxonomy
Apprentissage de plongements de mots sur des corpus en langue de spécialité : une étude d'impact (Learning word embeddings on domain-specific corpora: an impact study)

No leaderboard results yet.