
Word Embeddings

Word embedding is the collective name for a set of language modeling and feature learning techniques in natural language processing (NLP) where words or phrases from the vocabulary are mapped to vectors of real numbers.
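To make the definition concrete, here is a minimal sketch of an embedding lookup table in Python; the tiny vocabulary and 3-dimensional vectors are invented for illustration (real embeddings typically use 50 to 300 or more dimensions).

```python
import numpy as np

# Toy embedding table: each word maps to a vector of real numbers.
# (Vocabulary and values are made up purely for demonstration.)
embeddings = {
    "king":  np.array([0.8, 0.3, 0.1]),
    "queen": np.array([0.7, 0.4, 0.1]),
    "apple": np.array([0.1, 0.2, 0.9]),
}

def cosine_similarity(u, v):
    """Cosine of the angle between two vectors; values near 1.0 mean similar."""
    return np.dot(u, v) / (np.linalg.norm(u) * np.linalg.norm(v))

print(cosine_similarity(embeddings["king"], embeddings["queen"]))  # high
print(cosine_similarity(embeddings["king"], embeddings["apple"]))  # low
```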

Techniques for learning word embeddings include Word2Vec, GloVe, and neural network-based approaches that train embeddings on an NLP task such as language modeling or document classification.
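As an illustration of one such technique, the following is a minimal sketch of training skip-gram Word2Vec embeddings with the gensim library (assuming gensim 4.x); the toy corpus and hyperparameter values are placeholders chosen for demonstration, not drawn from any paper listed below.

```python
from gensim.models import Word2Vec

# Tiny made-up corpus: a list of tokenized sentences.
corpus = [
    ["word", "embeddings", "map", "words", "to", "vectors"],
    ["word2vec", "learns", "embeddings", "from", "raw", "text"],
    ["similar", "words", "get", "similar", "vectors"],
]

model = Word2Vec(
    sentences=corpus,
    vector_size=50,   # dimensionality of the embedding vectors
    window=2,         # context window size
    min_count=1,      # keep every word in this tiny corpus
    sg=1,             # 1 = skip-gram, 0 = CBOW
    epochs=50,        # many passes, since the corpus is tiny
)

vector = model.wv["embeddings"]                      # learned vector for a word
neighbors = model.wv.most_similar("words", topn=3)   # nearest neighbors by cosine
print(vector.shape, neighbors)
```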

(Image credit: Dynamic Word Embedding for Evolving Semantic Discovery)

Papers

Showing 1251–1260 of 4002 papers

- An Artificial Language Evaluation of Distributional Semantic Models
- Cross-lingual Word Sense Disambiguation using mBERT Embeddings with Syntactic Dependencies
- Automated Scoring of Clinical Expressive Language Evaluation Tasks
- Cross-Lingual Word Representations: Induction and Evaluation
- Cross-lingual Word Embeddings in Hyperbolic Space
- Automated Preamble Detection in Dictated Medical Reports
- Anaphora Resolution in Dialogue Systems for South Asian Languages
- Adversarial Learning with Contextual Embeddings for Zero-resource Cross-lingual Classification and NER
- A Comparison of Architectures and Pretraining Methods for Contextualized Multilingual Word Embeddings
- Parsimonious Morpheme Segmentation with an Application to Enriching Word Embeddings

No leaderboard results yet.