Word Embeddings

Word embedding is the collective name for a set of language modeling and feature learning techniques in natural language processing (NLP) where words or phrases from the vocabulary are mapped to vectors of real numbers.

Techniques for learning word embeddings include Word2Vec, GloVe, and other approaches, typically neural networks that train on an NLP task such as language modeling or document classification.
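To make the idea concrete, here is a minimal sketch of the skip-gram variant of Word2Vec trained with a full-softmax objective on a toy corpus. The corpus, hyperparameters, and the `most_similar` helper are illustrative assumptions rather than any library's API; real systems use optimizations such as negative sampling or hierarchical softmax.

```python
import numpy as np

# Toy corpus: each sentence is a list of tokens (illustrative only).
corpus = [
    "the quick brown fox jumps over the lazy dog".split(),
    "the dog sleeps while the fox runs".split(),
]

# Build the vocabulary and an index for each word.
vocab = sorted({w for sent in corpus for w in sent})
word2id = {w: i for i, w in enumerate(vocab)}
V = len(vocab)

# Hyperparameters (illustrative values, not tuned).
dim, window, lr, epochs = 16, 2, 0.05, 200
rng = np.random.default_rng(0)

# Input (center-word) and output (context-word) embedding matrices.
W_in = rng.normal(scale=0.1, size=(V, dim))
W_out = rng.normal(scale=0.1, size=(V, dim))

def softmax(z):
    z = z - z.max()          # numerical stability
    e = np.exp(z)
    return e / e.sum()

for _ in range(epochs):
    for sent in corpus:
        ids = [word2id[w] for w in sent]
        for pos, center in enumerate(ids):
            # Context words within the window around the center word.
            lo, hi = max(0, pos - window), min(len(ids), pos + window + 1)
            for ctx in ids[lo:pos] + ids[pos + 1:hi]:
                v = W_in[center]            # center-word vector
                p = softmax(W_out @ v)      # P(word | center) over vocabulary
                p[ctx] -= 1.0               # gradient of cross-entropy w.r.t. scores
                grad_in = W_out.T @ p       # gradient for the center vector
                W_out -= lr * np.outer(p, v)
                W_in[center] -= lr * grad_in

def most_similar(word, k=3):
    """Nearest neighbors by cosine similarity in the learned space."""
    v = W_in[word2id[word]]
    sims = W_in @ v / (np.linalg.norm(W_in, axis=1) * np.linalg.norm(v) + 1e-9)
    return [vocab[i] for i in np.argsort(-sims)[1:k + 1]]  # skip the word itself

print(most_similar("fox"))
```

After training, semantically related words in the toy corpus end up with nearby vectors in `W_in`, which is the property leaderboard tasks such as word similarity and analogy evaluate.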

(Image credit: Dynamic Word Embedding for Evolving Semantic Discovery)

Papers

Showing 1191–1200 of 4002 papers

- Disunited Nations? A Multiplex Network Approach to Detecting Preference Affinity Blocs using Texts and Votes
- Diving Deep into Clickbaits: Who Use Them to What Extents in Which Topics with What Effects?
- DLRG@DravidianLangTech-ACL2022: Abusive Comment Detection in Tamil using Multilingual Transformer Models
- DL Team at SemEval-2018 Task 1: Tweet Affect Detection using Sentiment Lexicons and Embeddings
- Development of a Japanese Personality Dictionary based on Psychological Methods
- DNN-Based Semantic Model for Rescoring N-best Speech Recognition List
- Bi-LSTM Neural Networks for Chinese Grammatical Error Diagnosis
- Closed Form Word Embedding Alignment
- Document Embedding for Scientific Articles: Efficacy of Word Embeddings vs TFIDF
- Developing Conversational Data and Detection of Conversational Humor in Telugu

No leaderboard results yet.