
Word Embeddings

Word embedding is the collective name for a set of language modeling and feature learning techniques in natural language processing (NLP) where words or phrases from the vocabulary are mapped to vectors of real numbers.

Techniques for learning word embeddings include Word2Vec, GloVe, and other neural network-based approaches that train on an NLP task such as language modeling or document classification.
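As a concrete illustration of the idea, the sketch below trains skip-gram Word2Vec embeddings on a toy corpus using the gensim library (an assumption here; the task itself is library-agnostic) and queries the resulting vector space. The corpus, dimensionality, and hyperparameters are illustrative only.

```python
# Minimal sketch: learning word embeddings with skip-gram Word2Vec.
# Assumes gensim >= 4.0 (where the parameter is named vector_size).
from gensim.models import Word2Vec

# Tiny illustrative corpus: a list of tokenized sentences.
corpus = [
    ["the", "king", "rules", "the", "kingdom"],
    ["the", "queen", "rules", "the", "kingdom"],
    ["a", "cat", "sits", "on", "the", "mat"],
    ["a", "dog", "sits", "on", "the", "rug"],
]

# Train skip-gram embeddings (sg=1); each vocabulary word is mapped
# to a 50-dimensional vector of real numbers.
model = Word2Vec(corpus, vector_size=50, window=2, min_count=1, sg=1, epochs=200)

vec = model.wv["king"]       # the learned 50-d vector for "king"
print(vec.shape)             # (50,)

# Cosine similarity in the embedding space surfaces related words.
print(model.wv.most_similar("king", topn=3))
```

On a realistic corpus, words used in similar contexts (e.g. "king" and "queen") end up with nearby vectors, which is what downstream NLP tasks exploit.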

(Image credit: Dynamic Word Embedding for Evolving Semantic Discovery)

Papers

Showing 2591–2600 of 4002 papers

Title | Status | Hype
Unsupervised Word Translation with Adversarial Autoencoder | | 0
UNT Linguistics at SemEval-2020 Task 12: Linear SVC with Pre-trained Word Embeddings as Document Vectors and Targeted Linguistic Features | | 0
Unveiling the Dreams of Word Embeddings: Towards Language-Driven Image Generation | | 0
UPB at SemEval-2020 Task 9: Identifying Sentiment in Code-Mixed Social Media Texts using Transformers and Multi-Task Learning | | 0
UPB at SemEval-2021 Task 1: Combining Deep Learning and Hand-Crafted Features for Lexical Complexity Prediction | | 0
Upgrading the Newsroom: An Automated Image Selection System for News Articles | | 0
Uppsala University at SemEval-2022 Task 1: Can Foreign Entries Enhance an English Reverse Dictionary? | | 0
UPV-28-UNITO at SemEval-2019 Task 7: Exploiting Post's Nesting and Syntax Information for Rumor Stance Classification | | 0
Urban Dictionary Embeddings for Slang NLP Applications | | 0
USAAR at SemEval-2016 Task 13: Hyponym Endocentricity | | 0
Page 260 of 401

No leaderboard results yet.