Word Embeddings

Word embedding is the collective name for a set of language modeling and feature learning techniques in natural language processing (NLP) where words or phrases from the vocabulary are mapped to vectors of real numbers.
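To make the mapping concrete, here is a minimal sketch in NumPy (the three-word vocabulary, the 4-dimensional vectors, and the helper names vector and cosine are all invented for illustration; real embeddings are learned from data, not hand-written):

```python
import numpy as np

# Hypothetical toy vocabulary and embedding matrix: one real-valued
# vector per word. In practice these values are learned, not hand-set.
vocab = {"king": 0, "queen": 1, "apple": 2}
embeddings = np.array([
    [0.8, 0.1, 0.7, 0.2],  # "king"
    [0.7, 0.2, 0.8, 0.1],  # "queen"
    [0.1, 0.9, 0.0, 0.8],  # "apple"
])

def vector(word):
    """The embedding lookup: map a word to its vector of real numbers."""
    return embeddings[vocab[word]]

def cosine(u, v):
    """Cosine similarity, the usual way to compare words in embedding space."""
    return float(u @ v / (np.linalg.norm(u) * np.linalg.norm(v)))

print(cosine(vector("king"), vector("queen")))  # high: semantically close words
print(cosine(vector("king"), vector("apple")))  # lower: unrelated words
```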

Techniques for learning word embeddings include Word2Vec, GloVe, and other neural-network-based approaches that train on an NLP task such as language modeling or document classification.
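One classic recipe is Word2Vec's skip-gram with negative sampling. The sketch below is a from-scratch NumPy version under stated assumptions (the toy corpus, the hyperparameters lr, window, and k, and the matrices W_in/W_out are all invented for the example; production code would use an optimized library such as gensim), not a reference implementation:

```python
import numpy as np

rng = np.random.default_rng(0)

# Invented toy corpus; any tokenized text would do.
corpus = "the cat sat on the mat the dog sat on the rug".split()
vocab = sorted(set(corpus))
idx = {w: i for i, w in enumerate(vocab)}
V, D = len(vocab), 16                 # vocabulary size, embedding dimension

W_in = rng.normal(0, 0.1, (V, D))     # target-word vectors (the embeddings we keep)
W_out = rng.normal(0, 0.1, (V, D))    # context-word vectors

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

lr, window, k = 0.05, 2, 5            # learning rate, context window, negatives
for epoch in range(200):
    for pos, word in enumerate(corpus):
        t = idx[word]
        lo, hi = max(0, pos - window), min(len(corpus), pos + window + 1)
        for cpos in range(lo, hi):
            if cpos == pos:
                continue
            # One positive (true context) pair plus k random negatives.
            pairs = [(idx[corpus[cpos]], 1.0)]
            pairs += [(int(n), 0.0) for n in rng.integers(0, V, k)]
            for o, label in pairs:
                grad = sigmoid(W_in[t] @ W_out[o]) - label  # logistic-loss gradient
                d_in = grad * W_out[o]
                W_out[o] -= lr * grad * W_in[t]
                W_in[t] -= lr * d_in

# Words that share contexts in the corpus end up with similar vectors.
def most_similar(word, topn=3):
    v = W_in[idx[word]]
    sims = W_in @ v / (np.linalg.norm(W_in, axis=1) * np.linalg.norm(v))
    return [vocab[i] for i in np.argsort(-sims)[:topn]]

print(most_similar("cat"))
```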

(Image credit: Dynamic Word Embedding for Evolving Semantic Discovery)

Papers

Showing papers 681-690 of 4002 ([Code] marks papers with an available implementation):

- How direct is the link between words and images?
- Using BERT Embeddings to Model Word Importance in Conversational Transcripts for Deaf and Hard of Hearing Users
- niksss at HinglishEval: Language-agnostic BERT-based Contextual Embeddings with Catboost for Quality Evaluation of the Low-Resource Synthetically Generated Code-Mixed Hinglish Text [Code]
- Language with Vision: a Study on Grounded Word and Sentence Embeddings [Code]
- JU_NLP at HinglishEval: Quality Evaluation of the Low-Resource Code-Mixed Hinglish Text
- TransDrift: Modeling Word-Embedding Drift using Transformer
- Contextualization and Generalization in Entity and Relation Extraction
- HICEM: A High-Coverage Emotion Model for Artificial Emotional Intelligence
- Transition-based Abstract Meaning Representation Parsing with Contextual Embeddings
- 1Cademy at Semeval-2022 Task 1: Investigating the Effectiveness of Multilingual, Multitask, and Language-Agnostic Tricks for the Reverse Dictionary Task

No leaderboard results yet.