
Word Embeddings

Word embedding is the collective name for a set of language modeling and feature learning techniques in natural language processing (NLP) where words or phrases from the vocabulary are mapped to vectors of real numbers.

Techniques for learning word embeddings include Word2Vec, GloVe, and other neural network-based approaches that train on an NLP task such as language modeling or document classification.
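To make the mapping from words to real-valued vectors concrete, the sketch below trains a small Word2Vec model. It is a minimal illustration, assuming the gensim library is installed; the toy corpus and all parameter values are illustrative choices, not taken from this page.

```python
# Minimal Word2Vec sketch, assuming gensim (https://radimrehurek.com/gensim/)
# is installed. The corpus below is a hypothetical toy example.
from gensim.models import Word2Vec

# Tiny tokenized corpus: each sentence is a list of word tokens.
corpus = [
    ["word", "embeddings", "map", "words", "to", "vectors"],
    ["similar", "words", "get", "similar", "vectors"],
    ["word2vec", "learns", "embeddings", "from", "context"],
]

# Train a small skip-gram model (sg=1); vector_size sets the
# dimensionality of the learned real-valued vectors.
model = Word2Vec(corpus, vector_size=50, window=2, min_count=1, sg=1, epochs=50)

# Each vocabulary word is now a dense vector of real numbers.
vec = model.wv["embeddings"]  # a NumPy array of shape (50,)
print(vec[:5])

# Nearest neighbours in the embedding space by cosine similarity.
print(model.wv.most_similar("words", topn=3))
```

On a corpus this small the neighbours are essentially arbitrary; the point is only the interface: each word becomes a fixed-length real vector, and similarity between words is measured geometrically between their vectors.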

(Image credit: Dynamic Word Embedding for Evolving Semantic Discovery)

Papers

Showing 1151-1160 of 4002 papers

| Title | Status | Hype |
| --- | --- | --- |
| Disentangling dialects: a neural approach to Indo-Aryan historical phonology and subgrouping | Code | 0 |
| Deep Image-to-Recipe Translation | Code | 0 |
| Global Textual Relation Embedding for Relational Understanding | Code | 0 |
| DeepHateExplainer: Explainable Hate Speech Detection in Under-resourced Bengali Language | Code | 0 |
| Discriminating between Lexico-Semantic Relations with the Specialization Tensor Model | Code | 0 |
| Distilling Semantic Concept Embeddings from Contrastively Fine-Tuned Language Models | Code | 0 |
| Does mBERT understand Romansh? Evaluating word embeddings using word alignment | Code | 0 |
| E2Vec: Feature Embedding with Temporal Information for Analyzing Student Actions in E-Book Systems | Code | 0 |
| Identification of Adjective-Noun Neologisms using Pretrained Language Models | Code | 0 |
| Embedding Transfer for Low-Resource Medical Named Entity Recognition: A Case Study on Patient Mobility | Code | 0 |

Leaderboard

No leaderboard results yet.