
Word Embeddings

Word embedding is the collective name for a set of language modeling and feature learning techniques in natural language processing (NLP) where words or phrases from the vocabulary are mapped to vectors of real numbers.

Techniques for learning word embeddings include Word2Vec, GloVe, and other neural approaches that train on an NLP task such as language modeling or document classification.
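
As a concrete sketch of how such embeddings are learned in practice, the snippet below trains a small Word2Vec model with the gensim library on a toy corpus. The corpus, vector size, and hyperparameters are illustrative assumptions only, not taken from any paper listed on this page.

    from gensim.models import Word2Vec

    # Toy corpus (illustrative only): each sentence is a list of tokens.
    sentences = [
        ["word", "embeddings", "map", "words", "to", "vectors"],
        ["word2vec", "learns", "embeddings", "from", "context"],
        ["similar", "words", "get", "similar", "vectors"],
    ]

    # Train a small skip-gram model (sg=1); vector_size is the embedding dimension.
    model = Word2Vec(sentences, vector_size=50, window=2, min_count=1, sg=1, epochs=50)

    # Each vocabulary word now maps to a 50-dimensional real-valued vector.
    vec = model.wv["embeddings"]
    print(vec.shape)  # (50,)

    # Cosine similarity in the embedding space surfaces related words.
    print(model.wv.most_similar("words", topn=3))

On a realistic corpus, nearby vectors tend to correspond to semantically or syntactically related words, which is what makes the learned vectors useful as features for downstream NLP tasks.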

(Image credit: Dynamic Word Embedding for Evolving Semantic Discovery)

Papers

Showing 2631–2640 of 4002 papers

Language Models for Code-switch Detection of te reo Māori and English in a Low-resource Setting
Language Transfer Learning for Supervised Lexical Substitution
Large scale analysis of gender bias and sexism in song lyrics
Large Scale Substitution-based Word Sense Induction
Large-scale Taxonomy Induction Using Entity and Word Embeddings
Lasige-BioTM at ProfNER: BiLSTM-CRF and contextual Spanish embeddings for Named Entity Recognition and Tweet Binary Classification
LaSTUS/TALN at SemEval-2019 Task 6: Identification and Categorization of Offensive Language in Social Media with Attention-based Bi-LSTM model
Latent Semantic Analysis Approach for Document Summarization Based on Word Embeddings
Latent Suicide Risk Detection on Microblog via Suicide-Oriented Word Embeddings and Layered Attention
Latent Topic Embedding
Page 264 of 401

No leaderboard results yet.