
Word Embeddings

Word embedding is the collective name for a set of language modeling and feature learning techniques in natural language processing (NLP) where words or phrases from the vocabulary are mapped to vectors of real numbers.
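
A minimal sketch of what this mapping buys: once words are points in a real-valued vector space, geometric closeness (for example, cosine similarity) can stand in for semantic similarity. The four-dimensional vectors below are invented purely for illustration; trained embeddings typically have tens to hundreds of dimensions.

```python
import numpy as np

# Hypothetical 4-dimensional embeddings, invented for this illustration;
# real trained embeddings typically have 50-300 learned dimensions.
embeddings = {
    "king":  np.array([0.8, 0.3, 0.1, 0.9]),
    "queen": np.array([0.7, 0.4, 0.2, 0.9]),
    "apple": np.array([0.1, 0.9, 0.8, 0.2]),
}

def cosine_similarity(u, v):
    """Cosine of the angle between u and v; values near 1.0 mean 'similar'."""
    return float(np.dot(u, v) / (np.linalg.norm(u) * np.linalg.norm(v)))

print(cosine_similarity(embeddings["king"], embeddings["queen"]))  # high
print(cosine_similarity(embeddings["king"], embeddings["apple"]))  # low
```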

Techniques for learning word embeddings include Word2Vec, GloVe, and other neural network-based approaches that train on an NLP task such as language modeling or document classification.
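
As a hedged sketch of what training one of these models can look like in practice, here is a skip-gram Word2Vec run using the gensim library; the three-sentence corpus is invented for the example, whereas real training data runs to millions of sentences.

```python
from gensim.models import Word2Vec

# Toy corpus, invented for this example.
corpus = [
    ["the", "cat", "sat", "on", "the", "mat"],
    ["the", "dog", "sat", "on", "the", "rug"],
    ["cats", "and", "dogs", "are", "pets"],
]

model = Word2Vec(
    sentences=corpus,
    vector_size=50,   # dimensionality of the learned vectors
    window=3,         # context window on each side of the target word
    min_count=1,      # keep every word; the corpus is tiny
    sg=1,             # 1 = skip-gram, 0 = CBOW
    epochs=100,       # many passes to compensate for the tiny corpus
)

vector = model.wv["cat"]                     # numpy array of shape (50,)
print(model.wv.most_similar("cat", topn=3))  # nearest neighbors by cosine
```

GloVe, by contrast, is fit to global co-occurrence statistics of the corpus rather than trained with a sliding-window predictive objective, though both produce the same kind of dense word vectors.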

(Image credit: Dynamic Word Embedding for Evolving Semantic Discovery)

Papers

Showing 581-590 of 4002 papers (page 59 of 401)

Title | Status | Hype
Automatic Extraction of Nested Entities in Clinical Referrals in Spanish | Code | 0
"This is my unicorn, Fluffy": Personalizing frozen vision-language representations | Code | 1
A Part-of-Speech Tagger for Yiddish | Code | 0
A bilingual approach to specialised adjectives through word embeddings in the karstology domain | - | 0
Asymmetric Proxy Loss for Multi-View Acoustic Word Embeddings | - | 0
Detecting Unassimilated Borrowings in Spanish: An Annotated Corpus and Approaches to Modeling | Code | 0
Semantic properties of English nominal pluralization: Insights from word embeddings | - | 0
An Evaluation Dataset for Legal Word Embedding: A Case Study On Chinese Codex | Code | 0
Comparing in context: Improving cosine similarity measures with a metric tensor | - | 0
Isomorphic Cross-lingual Embeddings for Low-Resource Languages | - | 0

No leaderboard results yet.