Word Embeddings

Word embedding is the collective name for a set of language modeling and feature learning techniques in natural language processing (NLP) where words or phrases from the vocabulary are mapped to vectors of real numbers.
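
For concreteness, here is a minimal sketch of that word-to-vector mapping: a hand-written toy embedding table (the three-dimensional values are made up for illustration, not taken from any trained model) together with a cosine-similarity comparison, the usual way of measuring how close two embeddings are.

```python
import numpy as np

# Toy embedding table: each word maps to a vector of real numbers.
# These 3-dimensional values are illustrative, not from a trained model.
embeddings = {
    "king":  np.array([0.80, 0.31, 0.42]),
    "queen": np.array([0.76, 0.35, 0.45]),
    "apple": np.array([0.05, 0.91, 0.10]),
}

def cosine_similarity(u, v):
    """Cosine of the angle between two embedding vectors."""
    return np.dot(u, v) / (np.linalg.norm(u) * np.linalg.norm(v))

# Semantically related words should score higher than unrelated ones.
print(cosine_similarity(embeddings["king"], embeddings["queen"]))  # high (~1.0)
print(cosine_similarity(embeddings["king"], embeddings["apple"]))  # lower
```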

Techniques for learning word embeddings include Word2Vec, GloVe, and other neural network-based approaches that train on an NLP task such as language modeling or document classification.
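
As a concrete illustration of the first of these techniques, the following is a minimal sketch of training skip-gram Word2Vec embeddings; the use of the gensim library, the toy corpus, and every hyperparameter value here are illustrative assumptions rather than a setup taken from any of the papers below.

```python
from gensim.models import Word2Vec

# Toy corpus: gensim expects an iterable of tokenized sentences.
sentences = [
    ["word", "embeddings", "map", "words", "to", "dense", "vectors"],
    ["similar", "words", "end", "up", "with", "similar", "vectors"],
    ["glove", "and", "word2vec", "learn", "such", "vectors"],
]

model = Word2Vec(
    sentences,
    vector_size=100,  # dimensionality of the learned vectors
    window=5,         # max distance between center and context word
    min_count=1,      # keep every word (a real corpus would use a higher cutoff)
    sg=1,             # 1 = skip-gram, 0 = CBOW
    epochs=50,        # many passes, since the toy corpus is tiny
)

vector = model.wv["vectors"]                       # 100-dim numpy array
neighbors = model.wv.most_similar("vectors", topn=3)
print(neighbors)
```

Here sg=1 selects the skip-gram objective, which predicts context words from the center word; sg=0 would train CBOW instead, which predicts the center word from its averaged context.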

(Image credit: Dynamic Word Embedding for Evolving Semantic Discovery)

Papers

Showing 761-770 of 4002 papers (page 77 of 401)

Title | Status | Hype
Co-learning of Word Representations and Morpheme Representations | | 0
A semi-supervised model for Persian rumor verification based on content information | | 0
A Semi-universal Pipelined Approach to the CoNLL 2017 UD Shared Task | | 0
A Model of Zero-Shot Learning of Spoken Language Understanding | | 0
Combination of Domain Knowledge and Deep Learning for Sentiment Analysis of Short and Informal Messages on Social Media | | 0
Combining Acoustics, Content and Interaction Features to Find Hot Spots in Meetings | | 0
Combining BERT with Static Word Embeddings for Categorizing Social Media | | 0
Combining Character and Word Embeddings for the Detection of Offensive Language in Arabic | | 0
Combining Contrastive Learning and Knowledge Graph Embeddings to develop medical word embeddings for the Italian language | | 0
A House United: Bridging the Script and Lexical Barrier between Hindi and Urdu | | 0

Leaderboards

No leaderboard results yet.