
Word Embeddings

Word embedding is the collective name for a set of language modeling and feature learning techniques in natural language processing (NLP) where words or phrases from the vocabulary are mapped to vectors of real numbers.

Techniques for learning word embeddings include Word2Vec, GloVe, and other neural-network-based approaches that are trained on an NLP task such as language modeling or document classification.

(Image credit: Dynamic Word Embedding for Evolving Semantic Discovery)
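As a quick illustration of the idea, the sketch below trains Word2Vec embeddings on a toy corpus with the gensim library (assuming the gensim 4.x API; the corpus and parameter values are arbitrary examples, not taken from this page):

```python
# Minimal Word2Vec sketch using gensim (assumed 4.x API).
from gensim.models import Word2Vec

# Toy corpus: a list of tokenized sentences (purely illustrative).
sentences = [
    ["word", "embeddings", "map", "words", "to", "vectors"],
    ["word2vec", "learns", "embeddings", "from", "context", "windows"],
    ["glove", "learns", "embeddings", "from", "cooccurrence", "counts"],
]

# Train skip-gram embeddings (sg=1); vector_size is the embedding dimension.
model = Word2Vec(sentences, vector_size=50, window=2, min_count=1, sg=1)

# Each vocabulary word is now mapped to a 50-dimensional real-valued vector.
vec = model.wv["embeddings"]
print(vec.shape)  # (50,)

# Nearest neighbors in the embedding space, by cosine similarity.
print(model.wv.most_similar("embeddings", topn=3))
```

With a real corpus (millions of sentences rather than three), nearby vectors come to reflect semantic similarity, which is what the papers listed below build on.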

Papers

Showing 641–650 of 4002 papers (page 65 of 401)

Title | Status | Hype
BRUMS at SemEval-2020 Task 3: Contextualised Embeddings for Predicting the (Graded) Effect of Context in Word Similarity | Code | 0
Dynamic Word Embeddings for Evolving Semantic Discovery | Code | 0
Bilingual Lexicon Induction through Unsupervised Machine Translation | Code | 0
DataStories at SemEval-2017 Task 4: Deep LSTM with Attention for Message-level and Topic-based Sentiment Analysis | Code | 0
Debiasing Convolutional Neural Networks via Meta Orthogonalization | Code | 0
Building a Kannada POS Tagger Using Machine Learning and Neural Network Models | Code | 0
Aggressive Language Identification Using Word Embeddings and Sentiment Features | Code | 0
Bilingual Learning of Multi-sense Embeddings with Discrete Autoencoders | Code | 0
Application of a Hybrid Bi-LSTM-CRF model to the task of Russian Named Entity Recognition | Code | 0
Debiasing Multilingual Word Embeddings: A Case Study of Three Indian Languages | Code | 0

No leaderboard results yet.