
Word Embeddings

Word embedding is the collective name for a set of language modeling and feature learning techniques in natural language processing (NLP) where words or phrases from the vocabulary are mapped to vectors of real numbers.
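
To make the mapping concrete, here is a minimal sketch in plain NumPy of an embedding as a vocabulary-to-vector lookup table. The vocabulary, vector dimension, and helper names are illustrative placeholders, and the vectors are randomly initialized, so the similarity score is meaningless until the vectors are actually trained:

```python
import numpy as np

# A word embedding is, at its core, a lookup table that assigns each
# vocabulary word a dense vector of real numbers. The 4-dimensional
# vectors here are random placeholders; in practice they are learned.
vocab = ["king", "queen", "man", "woman"]
word_to_index = {word: i for i, word in enumerate(vocab)}

rng = np.random.default_rng(seed=0)
embedding_matrix = rng.normal(size=(len(vocab), 4))

def embed(word: str) -> np.ndarray:
    """Map a word to its real-valued vector."""
    return embedding_matrix[word_to_index[word]]

def cosine(u: np.ndarray, v: np.ndarray) -> float:
    """Cosine similarity, the usual way to compare embeddings."""
    return float(u @ v / (np.linalg.norm(u) * np.linalg.norm(v)))

print(embed("king"))                          # a vector of real numbers
print(cosine(embed("king"), embed("queen")))  # untrained, so uninformative
```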

Techniques for learning word embeddings include Word2Vec, GloVe, and neural network-based approaches that train on an NLP task such as language modeling or document classification.
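
As one concrete entry point, the sketch below trains a skip-gram Word2Vec model with the gensim library (assuming gensim 4.x is installed). The toy corpus is a made-up placeholder, so the learned neighbors will not be meaningful; a real corpus would be orders of magnitude larger:

```python
from gensim.models import Word2Vec

# Toy corpus: each sentence is a pre-tokenized list of words.
sentences = [
    ["the", "king", "rules", "the", "kingdom"],
    ["the", "queen", "rules", "the", "kingdom"],
    ["the", "man", "walks", "the", "dog"],
    ["the", "woman", "walks", "the", "dog"],
]

# Train a skip-gram model (sg=1). vector_size, window, and min_count
# are the usual knobs; workers=1 keeps the run reproducible.
model = Word2Vec(
    sentences,
    vector_size=50,
    window=2,
    min_count=1,
    sg=1,
    epochs=200,
    seed=0,
    workers=1,
)

vector = model.wv["king"]                     # the learned embedding
print(vector[:5])
print(model.wv.most_similar("king", topn=3))  # nearest neighbors by cosine
```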

(Image credit: Dynamic Word Embedding for Evolving Semantic Discovery)

Papers

Showing 1021–1030 of 4002 papers

Title | Status | Hype
SINAI at SemEval-2021 Task 5: Combining Embeddings in a BiLSTM-CRF model for Toxic Spans Detection | | 0
CLULEX at SemEval-2021 Task 1: A Simple System Goes a Long Way | | 0
“Are you calling for the vaporizer you ordered?” Combining Search and Prediction to Identify Orders in Contact Centers | | 0
TUDA-CCL at SemEval-2021 Task 1: Using Gradient-boosted Regression Tree Ensembles Trained on a Heterogeneous Feature Set for Predicting Lexical Complexity | | 0
GX at SemEval-2021 Task 2: BERT with Lemma Information for MCL-WiC Task | Code | 0
Applying Occam’s Razor to Transformer-Based Dependency Parsing: What Works, What Doesn’t, and What is Really Necessary | | 0
RS_GV at SemEval-2021 Task 1: Sense Relative Lexical Complexity Prediction | | 0
Modeling Text using the Continuous Space Topic Model with Pre-Trained Word Embeddings | | 0
Evaluating a Joint Training Approach for Learning Cross-lingual Embeddings with Sub-word Information without Parallel Corpora on Lower-resource Languages | | 0
Tracking Semantic Change in Cognate Sets for English and Romance Languages | | 0
Page 103 of 401

Leaderboard

No leaderboard results yet.