Word Embeddings

Word embedding is the collective name for a set of language modeling and feature learning techniques in natural language processing (NLP) where words or phrases from the vocabulary are mapped to vectors of real numbers.

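Concretely, an embedding is a lookup table: each word indexes a row of a real-valued matrix, and geometric closeness between rows stands in for semantic similarity. Below is a minimal sketch in Python; the vocabulary and 4-dimensional vectors are made up for illustration (learned embeddings typically have 50 to 300 dimensions):

```python
import numpy as np

# Toy vocabulary and embedding matrix: one row of real numbers per word.
# These vectors are invented for illustration, not learned from data.
vocab = {"king": 0, "queen": 1, "apple": 2}
embeddings = np.array([
    [0.8, 0.6, 0.1, 0.2],   # king
    [0.7, 0.7, 0.1, 0.3],   # queen
    [0.1, 0.0, 0.9, 0.8],   # apple
])

def vector(word):
    """Map a word to its vector of real numbers via table lookup."""
    return embeddings[vocab[word]]

def cosine(u, v):
    """Cosine similarity: related words should point in similar directions."""
    return float(u @ v / (np.linalg.norm(u) * np.linalg.norm(v)))

print(cosine(vector("king"), vector("queen")))  # high: related words
print(cosine(vector("king"), vector("apple")))  # low: unrelated words
```
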
Techniques for learning word embeddings include Word2Vec, GloVe, and other neural network-based approaches that train the vectors on an auxiliary NLP task such as language modeling or document classification.

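As a sketch of how such embeddings might be trained in practice, the example below fits a skip-gram Word2Vec model with the gensim library. The toy corpus and hyperparameter values are illustrative choices only, and the parameter names assume gensim >= 4.0 (earlier versions use different names, e.g. size instead of vector_size):

```python
# Assumes gensim >= 4.0 (pip install gensim).
from gensim.models import Word2Vec

# A real run would use a large tokenized corpus; this tiny one is illustrative.
sentences = [
    ["the", "king", "rules", "the", "kingdom"],
    ["the", "queen", "rules", "the", "kingdom"],
    ["i", "ate", "an", "apple"],
]

model = Word2Vec(
    sentences,
    vector_size=50,   # embedding dimensionality
    window=2,         # context window around each target word
    min_count=1,      # keep every word (raise this on real corpora)
    sg=1,             # 1 = skip-gram, 0 = CBOW
    epochs=50,        # extra passes help on such a tiny corpus
)

vec = model.wv["king"]                # the learned 50-d vector for "king"
print(model.wv.most_similar("king"))  # nearest neighbours by cosine similarity
```
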
(Image credit: Dynamic Word Embedding for Evolving Semantic Discovery)

Papers

Showing 3331–3340 of 4002 papers

Title | Status | Hype
Combining Textual Features for the Detection of Hateful and Offensive Language | Code | 0
Discourse Relation Embeddings: Representing the Relations between Discourse Segments in Social Media | Code | 0
Neural Networks approaches focused on French Spoken Language Understanding: application to the MEDIA Evaluation Task | Code | 0
Combining Representations For Effective Citation Classification | Code | 0
Discovering and Interpreting Biased Concepts in Online Communities | Code | 0
Word2vec to behavior: morphology facilitates the grounding of language in machines | Code | 0
Combining financial word embeddings and knowledge-based features for financial text summarization UC3M-MC System at FNS-2020 | Code | 0
Discovering emergent connections in quantum physics research via dynamic word embeddings | Code | 0
Towards Multi-Sense Cross-Lingual Alignment of Contextual Embeddings | Code | 0
Neural Networks for Open Domain Targeted Sentiment | Code | 0
