
Word Embeddings

Word embedding is the collective name for a set of language modeling and feature learning techniques in natural language processing (NLP) where words or phrases from the vocabulary are mapped to vectors of real numbers.
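
As a minimal sketch of the idea (the vectors below are invented for illustration, not learned from data), an embedding can be viewed as a lookup table from words to dense real-valued vectors, with semantic relatedness measured by, for example, cosine similarity:

```python
import numpy as np

# Hypothetical 3-dimensional embeddings; real models use hundreds of
# dimensions learned from data, not hand-picked values like these.
embeddings = {
    "king":  np.array([0.80, 0.30, 0.10]),
    "queen": np.array([0.75, 0.35, 0.20]),
    "apple": np.array([0.10, 0.90, 0.60]),
}

def cosine_similarity(a, b):
    """Cosine of the angle between two vectors; 1.0 means identical direction."""
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))

# Semantically related words should end up with higher similarity.
print(cosine_similarity(embeddings["king"], embeddings["queen"]))  # high
print(cosine_similarity(embeddings["king"], embeddings["apple"]))  # lower
```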

Techniques for learning word embeddings include Word2Vec, GloVe, and other neural network-based approaches that train on an NLP task such as language modeling or document classification.
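
For instance, a Word2Vec model can be trained with the gensim library (a sketch assuming gensim 4.x; the toy corpus below is invented for illustration):

```python
from gensim.models import Word2Vec

# Tiny invented corpus; in practice Word2Vec is trained on millions of sentences.
sentences = [
    ["the", "king", "rules", "the", "kingdom"],
    ["the", "queen", "rules", "the", "kingdom"],
    ["the", "apple", "fell", "from", "the", "tree"],
]

# sg=1 selects the skip-gram architecture; vector_size is the embedding dimension.
model = Word2Vec(sentences, vector_size=50, window=2, min_count=1, sg=1, epochs=100)

# Trained vectors live in model.wv and can be queried for nearest neighbors.
vector = model.wv["king"]                    # the 50-dimensional embedding of "king"
print(model.wv.most_similar("king", topn=2))
```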

(Image credit: Dynamic Word Embedding for Evolving Semantic Discovery)

Papers

Showing 1341–1350 of 4002 papers

Title | Status | Hype
A Preliminary Study on a Conceptual Game Feature Generation and Recommendation System |  | 0
Enhanced Word Representations for Bridging Anaphora Resolution |  | 0
Evaluating Biomedical BERT Models for Vocabulary Alignment at Scale in the UMLS Metathesaurus |  | 0
Case Studies on using Natural Language Processing Techniques in Customer Relationship Management Software |  | 0
Enhancing Chinese Intent Classification by Dynamically Integrating Character Features into Word Embeddings with Ensemble Techniques |  | 0
Enhancing Clinical Concept Extraction with Contextual Embeddings |  | 0
Casteism in India, but Not Racism - a Study of Bias in Word Embeddings of Indian Languages |  | 0
Enhancing General Sentiment Lexicons for Domain-Specific Use |  | 0
Evaluating Feature Extraction Methods for Knowledge-based Biomedical Word Sense Disambiguation |  | 0
Detecting Unassimilated Borrowings in Spanish: An Annotated Corpus and Approaches to Modeling |  | 0
Page 135 of 401

Leaderboards

No leaderboard results yet.