
Word Embeddings

Word embedding is the collective name for a set of language modeling and feature learning techniques in natural language processing (NLP) where words or phrases from the vocabulary are mapped to vectors of real numbers.
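To make the mapping concrete, here is a minimal sketch of the lookup structure: a table from words to dense real-valued vectors, compared with cosine similarity. The vectors below are random placeholders (an assumption for illustration only); in a real system they would be learned from data.

```python
import numpy as np

# Toy vocabulary mapped to dense real-valued vectors. In practice
# these vectors are learned from a corpus; here they are random
# placeholders that only illustrate the lookup structure.
rng = np.random.default_rng(seed=0)
vocab = ["king", "queen", "man", "woman"]
dim = 8  # embedding dimension (small, for illustration only)
embeddings = {word: rng.standard_normal(dim) for word in vocab}

def cosine_similarity(a, b):
    # Cosine similarity is the standard way to compare embedding vectors.
    return float(a @ b / (np.linalg.norm(a) * np.linalg.norm(b)))

vec = embeddings["king"]
print(vec.shape)  # (8,) -- one real-valued vector per word
print(cosine_similarity(embeddings["king"], embeddings["queen"]))
```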

Techniques for learning word embeddings include Word2Vec, GloVe, and other approaches that train on an NLP task such as language modeling or document classification.
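As one hedged example of such a technique, the sketch below trains a small skip-gram Word2Vec model with the gensim library (assuming gensim 4.x; the toy corpus is invented for illustration):

```python
# Requires: pip install gensim  (assumes the gensim 4.x API)
from gensim.models import Word2Vec

# A tiny tokenized corpus; real training uses millions of sentences.
sentences = [
    ["word", "embeddings", "map", "words", "to", "vectors"],
    ["word2vec", "learns", "embeddings", "from", "context"],
    ["glove", "uses", "global", "co-occurrence", "statistics"],
]

model = Word2Vec(
    sentences,
    vector_size=50,  # dimensionality of the learned vectors
    window=3,        # context window size
    min_count=1,     # keep every word in this toy corpus
    sg=1,            # 1 = skip-gram, 0 = CBOW
    epochs=20,
)

vector = model.wv["embeddings"]                       # learned 50-d vector
neighbors = model.wv.most_similar("embeddings", topn=3)
print(vector.shape, neighbors)
```

Skip-gram (sg=1) learns vectors by predicting surrounding context words from the center word; GloVe instead fits vectors to global co-occurrence counts, and its training procedure is not shown here.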

(Image credit: Dynamic Word Embedding for Evolving Semantic Discovery)

Papers

Showing 3981–3990 of 4002 papers

Title | Status | Hype
Reflection-based Word Attribute Transfer | Code | 0
Unsupervised Approach to Evaluate Sentence-Level Fluency: Do We Really Need Reference? | Code | 0
An Automatic Question Usability Evaluation Toolkit | Code | 0
Improve Chinese Word Embeddings by Exploiting Internal Structure | Code | 0
Building a Kannada POS Tagger Using Machine Learning and Neural Network Models | Code | 0
Words with Consistent Diachronic Usage Patterns are Learned Earlier: A Computational Analysis Using Temporally Aligned Word Embeddings | Code | 0
Multi-granular Legal Topic Classification on Greek Legislation | Code | 0
Improved Biomedical Word Embeddings in the Transformer Era | Code | 0
Multi hash embeddings in spaCy | Code | 0
Multi-label Categorization of Accounts of Sexism using a Neural Framework | Code | 0
