
Word Embeddings

Word embedding is the collective name for a set of language modeling and feature learning techniques in natural language processing (NLP) where words or phrases from the vocabulary are mapped to vectors of real numbers.

Techniques for learning word embeddings include Word2Vec, GloVe, and other neural network-based approaches that train on an NLP task such as language modeling or document classification.
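
As a minimal sketch of how such vectors behave, the snippet below trains a tiny skip-gram Word2Vec model using the gensim library (an assumed dependency; the toy corpus and all hyperparameters are illustrative, not a recipe) and queries the learned vectors:

```python
# Minimal sketch, assuming gensim is installed (pip install gensim).
# Real models are trained on corpora with millions of sentences.
from gensim.models import Word2Vec

# Toy corpus: each sentence is a list of tokens.
sentences = [
    ["the", "king", "rules", "the", "kingdom"],
    ["the", "queen", "rules", "the", "kingdom"],
    ["the", "cat", "sat", "on", "the", "mat"],
    ["the", "dog", "sat", "on", "the", "mat"],
]

# sg=1 selects the skip-gram objective; vector_size is the
# dimensionality of the real-valued vector assigned to each word.
model = Word2Vec(
    sentences, vector_size=50, window=2, min_count=1, sg=1, epochs=200
)

# Each vocabulary word is now mapped to a dense vector of real numbers.
vec = model.wv["king"]          # numpy array of shape (50,)
print(vec.shape)

# Geometric closeness in the vector space reflects distributional
# similarity in the training corpus.
print(model.wv.most_similar("king", topn=3))
```

On this toy corpus, "queen" tends to rank among the nearest neighbors of "king" because the two words appear in identical contexts, which is the distributional signal these techniques exploit.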

(Image credit: Dynamic Word Embedding for Evolving Semantic Discovery)

Papers

Showing 3371–3380 of 4002 papers

Title | Hype
Code-Switched Named Entity Recognition with Embedding Attention | 0
Code-switching Language Modeling With Bilingual Word Embeddings: A Case Study for Egyptian Arabic-English | 0
CogALex-V Shared Task: CGSRC - Classifying Semantic Relations using Convolutional Neural Networks | 0
CogALex-V Shared Task: GHHH - Detecting Semantic Relations via Word Embeddings | 0
CogALex-V Shared Task: LOPE | 0
CogniFNN: A Fuzzy Neural Network Framework for Cognitive Word Embedding Evaluation | 0
CogniVal in Action: An Interface for Customizable Cognitive Word Embedding Evaluation | 0
CogNLP-Sheffield at CMCL 2021 Shared Task: Blending Cognitively Inspired Features with Transformer-based Language Models for Predicting Eye Tracking Patterns | 0
Coherence models in schizophrenia | 0
COIN – an Inexpensive and Strong Baseline for Predicting Out of Vocabulary Word Embeddings | 0
Page 338 of 401

No leaderboard results yet.