
Word Embeddings

Word embedding is the collective name for a set of language modeling and feature learning techniques in natural language processing (NLP) where words or phrases from the vocabulary are mapped to vectors of real numbers.
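For intuition, the sketch below shows this mapping as a lookup into an embedding matrix. The toy vocabulary is hypothetical and the vectors are randomly initialized here, standing in for the learned weights a real model would produce:

```python
import numpy as np

# Toy vocabulary mapping each word to a row index (illustrative only).
vocab = {"king": 0, "queen": 1, "man": 2, "woman": 3}

# Embedding matrix: one d-dimensional real-valued vector per word.
# Randomly initialized here; in practice these weights are learned.
rng = np.random.default_rng(seed=0)
embedding_dim = 8
embeddings = rng.normal(size=(len(vocab), embedding_dim))

def embed(word: str) -> np.ndarray:
    """Map a word to its vector of real numbers via a table lookup."""
    return embeddings[vocab[word]]

print(embed("queen"))  # an 8-dimensional real-valued vector
```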

Techniques for learning word embeddings include Word2Vec, GloVe, and other neural network-based approaches that train on an NLP task such as language modeling or document classification.
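As an illustration, the snippet below trains skip-gram embeddings with the gensim library's Word2Vec class. The four-sentence corpus is a placeholder and the hyperparameters are illustrative, not tuned recommendations:

```python
from gensim.models import Word2Vec

# Toy corpus: a list of tokenized sentences (placeholder data).
sentences = [
    ["the", "king", "rules", "the", "kingdom"],
    ["the", "queen", "rules", "the", "kingdom"],
    ["a", "man", "walks"],
    ["a", "woman", "walks"],
]

# Train skip-gram Word2Vec embeddings; hyperparameters are illustrative.
model = Word2Vec(
    sentences,
    vector_size=100,  # dimensionality of the word vectors
    window=5,         # context window size
    min_count=1,      # keep all words, even rare ones (toy corpus)
    sg=1,             # 1 = skip-gram, 0 = CBOW
    epochs=20,
)

# Look up a learned vector and its nearest neighbors in embedding space.
vector = model.wv["queen"]
print(model.wv.most_similar("queen", topn=3))
```

On a realistic corpus, min_count is usually raised well above 1 to prune rare words before training.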

(Image credit: Dynamic Word Embedding for Evolving Semantic Discovery)

Papers

Showing 1181–1190 of 4002 papers

Title | Status | Hype
CLaC at SemEval-2020 Task 5: Muli-task Stacked Bi-LSTMs | - | 0
CitiusNLP at SemEval-2020 Task 3: Comparing Two Approaches for Word Vector Contextualization | - | 0
UZH at SemEval-2020 Task 3: Combining BERT with WordNet Sense Embeddings to Predict Graded Word Similarity Changes | - | 0
UNT Linguistics at SemEval-2020 Task 12: Linear SVC with Pre-trained Word Embeddings as Document Vectors and Targeted Linguistic Features | - | 0
TUE at SemEval-2020 Task 1: Detecting Semantic Change by Clustering Contextual Word Embeddings | - | 0
LT3 at SemEval-2020 Task 7: Comparing Feature-Based and Transformer-Based Approaches to Detect Funny Headlines | - | 0
DoTheMath at SemEval-2020 Task 12: Deep Neural Networks with Self Attention for Arabic Offensive Language Detection | - | 0
DiaSense at SemEval-2020 Task 1: Modeling Sense Change via Pre-trained BERT Embeddings | - | 0
BhamNLP at SemEval-2020 Task 12: An Ensemble of Different Word Embeddings and Emotion Transfer Learning for Arabic Offensive Language Identification in Social Media | - | 0
Cross-lingual Annotation Projection in Legal Texts | Code | 0

Leaderboard

No leaderboard results yet.