
Word Embeddings

Word embedding is the collective name for a set of language modeling and feature learning techniques in natural language processing (NLP) where words or phrases from the vocabulary are mapped to vectors of real numbers.
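To make the definition concrete, the following minimal sketch (in Python with NumPy) maps a toy vocabulary to hand-picked 3-dimensional vectors and compares words by cosine similarity. The vocabulary and vector values are invented for illustration; real embeddings are learned from data.

```python
import numpy as np

# Toy illustration: a vocabulary mapped to dense real-valued vectors.
# These vectors are made up for demonstration; real embeddings are
# learned from text by methods such as Word2Vec or GloVe.
embeddings = {
    "king":  np.array([0.50, 0.70, 0.10]),
    "queen": np.array([0.52, 0.68, 0.15]),
    "apple": np.array([0.05, 0.10, 0.90]),
}

def cosine_similarity(u, v):
    """Cosine of the angle between two vectors; values near 1 mean similar."""
    return float(np.dot(u, v) / (np.linalg.norm(u) * np.linalg.norm(v)))

print(cosine_similarity(embeddings["king"], embeddings["queen"]))  # high
print(cosine_similarity(embeddings["king"], embeddings["apple"]))  # low
```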

Common techniques for learning word embeddings include Word2Vec, GloVe, and other neural network-based approaches that train on an NLP task such as language modeling or document classification.
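As one example of such a technique, the sketch below trains a skip-gram Word2Vec model with the gensim library (assuming gensim ≥ 4.0 is installed); the toy corpus and hyperparameters are invented for illustration, not a recommended configuration.

```python
from gensim.models import Word2Vec

# A tiny tokenized corpus; any iterable of token lists works.
corpus = [
    ["word", "embeddings", "map", "words", "to", "vectors"],
    ["word2vec", "learns", "embeddings", "from", "raw", "text"],
    ["glove", "builds", "embeddings", "from", "cooccurrence", "counts"],
]

# Train a skip-gram Word2Vec model (sg=1); parameters are illustrative.
model = Word2Vec(
    sentences=corpus,
    vector_size=50,   # embedding dimensionality
    window=2,         # context window size
    min_count=1,      # keep every token in this toy corpus
    sg=1,             # 1 = skip-gram, 0 = CBOW
    epochs=50,
)

vec = model.wv["embeddings"]          # the learned vector for a word
print(model.wv.most_similar("word"))  # nearest neighbours in the space
```

After training, `model.wv` holds the learned vectors, so similar words (by the cosine measure shown earlier) end up close together in the embedding space.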

(Image credit: Dynamic Word Embedding for Evolving Semantic Discovery)

Papers

Showing 1481–1490 of 4002 papers

Title | Status | Hype
Contextualized Spoken Word Representations from Convolutional Autoencoders | - | 0
Reflection-based Word Attribute Transfer | Code | 0
Tweets Sentiment Analysis via Word Embeddings and Machine Learning Techniques | - | 0
Explainable Depression Detection with Multi-Modalities Using a Hybrid Deep Learning Model on Social Media | - | 0
Automated Scoring of Clinical Expressive Language Evaluation Tasks | - | 0
Transition-based Semantic Dependency Parsing with Pointer Networks | - | 0
Adversarial Evaluation of BERT for Biomedical Named Entity Recognition | - | 0
Analyzing the Framing of 2020 Presidential Candidates in the News | - | 0
COVID-19 and Arabic Twitter: How can Arab World Governments and Public Health Organizations Learn from Social Media? | - | 0
CopyBERT: A Unified Approach to Question Generation with Self-Attention | - | 0
Page 149 of 401

No leaderboard results yet.