
Word Embeddings

Word embedding is the collective name for a set of language modeling and feature learning techniques in natural language processing (NLP) where words or phrases from the vocabulary are mapped to vectors of real numbers.
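As a minimal sketch of what this mapping looks like (the vocabulary and the 4-dimensional vectors below are invented for illustration, not trained values), each word is stored as a dense real-valued vector, and a geometric measure such as cosine similarity quantifies semantic relatedness:

```python
import numpy as np

# Hypothetical embedding table: each vocabulary word maps to a
# dense vector of real numbers (values here are made up).
embeddings = {
    "king":  np.array([0.50, 0.68, -0.59, 0.12]),
    "queen": np.array([0.54, 0.71, -0.55, 0.60]),
    "apple": np.array([-0.32, 0.11, 0.83, -0.04]),
}

def cosine_similarity(u, v):
    """Cosine of the angle between two embedding vectors, in [-1, 1]."""
    return float(u @ v / (np.linalg.norm(u) * np.linalg.norm(v)))

# In trained embeddings, semantically related words lie closer together.
print(cosine_similarity(embeddings["king"], embeddings["queen"]))  # higher
print(cosine_similarity(embeddings["king"], embeddings["apple"]))  # lower
```

In real systems these coordinates are learned from corpus co-occurrence statistics rather than hand-picked.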

Techniques for learning word embeddings include Word2Vec, GloVe, and other neural network-based approaches that train on an NLP task such as language modeling or document classification.
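For instance, a skip-gram Word2Vec model can be trained in a few lines with the gensim library (a sketch assuming gensim 4.x; the toy corpus is invented and far too small to yield meaningful vectors):

```python
from gensim.models import Word2Vec

# Tiny toy corpus of tokenized sentences; a real model would train
# on millions of sentences.
sentences = [
    ["the", "king", "rules", "the", "kingdom"],
    ["the", "queen", "rules", "the", "kingdom"],
    ["she", "ate", "an", "apple"],
]

# Skip-gram Word2Vec (sg=1); parameter names follow the gensim 4.x API.
model = Word2Vec(
    sentences,
    vector_size=50,  # dimensionality of the learned vectors
    window=2,        # context window size
    min_count=1,     # keep every word in this toy vocabulary
    sg=1,            # 1 = skip-gram, 0 = CBOW
    epochs=50,
)

vector = model.wv["king"]             # the learned 50-dimensional embedding
print(model.wv.most_similar("king"))  # nearest neighbours in the vector space
```

GloVe, by contrast, is fit to global co-occurrence counts rather than trained with a sliding-window neural objective, but it produces the same kind of dense vector per word.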

(Image credit: Dynamic Word Embedding for Evolving Semantic Discovery)

Papers

Showing 1251–1260 of 4002 papers

Title | Status | Hype
SMM4H Shared Task 2020 - A Hybrid Pipeline for Identifying Prescription Drug Abuse from Twitter: Machine Learning, Deep Learning, and Post-Processing | — | 0
Graph-based Syntactic Word Embeddings | — | 0
UAlberta at SemEval-2020 Task 2: Using Translations to Predict Cross-Lingual Entailment | — | 0
Cross-lingual Annotation Projection in Legal Texts | Code | 0
UZH at SemEval-2020 Task 3: Combining BERT with WordNet Sense Embeddings to Predict Graded Word Similarity Changes | — | 0
Augmenting NLP models using Latent Feature Interpolations | — | 0
TemporalTeller at SemEval-2020 Task 1: Unsupervised Lexical Semantic Change Detection with Temporal Referencing | — | 0
Coordination Boundary Identification without Labeled Data for Compound Terms Disambiguation | — | 0
Manifold Learning-based Word Representation Refinement Incorporating Global and Local Information | — | 0
Contextualized Embeddings for Enriching Linguistic Analyses on Politeness | — | 0

Leaderboard

No leaderboard results yet.