Word Embeddings

Word embedding is the collective name for a set of language modeling and feature learning techniques in natural language processing (NLP) where words or phrases from the vocabulary are mapped to vectors of real numbers.

Techniques for learning word embeddings include Word2Vec, GloVe, and other approaches, many of them neural network-based, that train on an NLP task such as language modeling or document classification.
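As a quick illustration of the idea, the sketch below trains a small skip-gram Word2Vec model with gensim (assuming gensim 4.x); the toy corpus and all hyperparameter values are illustrative assumptions, not taken from any paper listed on this page.

```python
# A minimal sketch of learning word embeddings with gensim's Word2Vec.
# Assumes gensim 4.x; the corpus and hyperparameters below are toy values.
from gensim.models import Word2Vec

# Tiny tokenized corpus; real training uses millions of sentences.
sentences = [
    ["the", "king", "rules", "the", "kingdom"],
    ["the", "queen", "rules", "the", "kingdom"],
    ["the", "cat", "sat", "on", "the", "mat"],
    ["the", "dog", "sat", "on", "the", "rug"],
]

model = Word2Vec(
    sentences,
    vector_size=50,   # dimensionality of the embedding vectors
    window=2,         # context window size
    min_count=1,      # keep every word in this toy vocabulary
    sg=1,             # 1 = skip-gram, 0 = CBOW
    epochs=100,       # many epochs to compensate for the tiny corpus
)

# Each word is now mapped to a vector of real numbers...
vec = model.wv["king"]
print(vec.shape)  # (50,)

# ...and words that appear in similar contexts end up with nearby vectors.
print(model.wv.most_similar("king", topn=3))
```

On a realistic corpus, similarity queries like the last line surface semantically related words; on this four-sentence example the neighbors are only suggestive.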

(Image credit: Dynamic Word Embedding for Evolving Semantic Discovery)

Papers

Showing 2821-2830 of 4002 papers

Title | Status | Hype
A supervised approach to taxonomy extraction using word embeddings | | 0
Evaluation of Domain-specific Word Embeddings using Knowledge Resources | | 0
Handling Normalization Issues for Part-of-Speech Tagging of Online Conversational Text | | 0
Urdu Word Embeddings | Code | 0
Constructing High Quality Sense-specific Corpus and Word Embedding via Unsupervised Elimination of Pseudo Multi-sense | | 0
Language adaptation experiments via cross-lingual embeddings for related languages | | 0
Lexical and Semantic Features for Cross-lingual Text Reuse Classification: an Experiment in English and Latin Paraphrases | | 0
All-words Word Sense Disambiguation Using Concept Embeddings | | 0
Word Embedding Approach for Synonym Extraction of Multi-Word Terms | Code | 0
Ensemble Romanian Dependency Parsing with Neural Networks | | 0

No leaderboard results yet.