
Word Embeddings

Word embedding is the collective name for a set of language modeling and feature learning techniques in natural language processing (NLP) where words or phrases from the vocabulary are mapped to vectors of real numbers.

Techniques for learning word embeddings include Word2Vec, GloVe, and other neural network-based approaches that train on an NLP task such as language modeling or document classification.

(Image credit: Dynamic Word Embedding for Evolving Semantic Discovery)
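To make the mapping concrete, here is a minimal sketch of training and querying skip-gram embeddings with the gensim library. This page prescribes no particular implementation, so gensim is an assumption; parameter names follow the gensim 4.x API, and the toy corpus and hyperparameters are purely illustrative.

```python
# Minimal sketch: skip-gram Word2Vec with gensim 4.x.
# The corpus and hyperparameters below are illustrative assumptions,
# not values taken from any paper listed on this page.
from gensim.models import Word2Vec

# Tiny tokenized corpus; real training needs far more text.
corpus = [
    ["the", "king", "rules", "the", "kingdom"],
    ["the", "queen", "rules", "the", "kingdom"],
    ["embeddings", "map", "words", "to", "vectors"],
]

model = Word2Vec(
    sentences=corpus,
    vector_size=50,   # dimensionality of the real-valued word vectors
    window=2,         # context window size
    min_count=1,      # keep every word in this toy vocabulary
    sg=1,             # 1 = skip-gram; 0 = CBOW
    epochs=100,       # many passes, since the corpus is tiny
)

# Each word is now a vector of real numbers...
vec = model.wv["king"]   # a 50-dimensional numpy array
print(vec.shape)         # (50,)

# ...and geometric proximity reflects distributional similarity.
print(model.wv.most_similar("king", topn=3))
```

With more data, nearest-neighbor queries like the one above recover semantic relationships (e.g. "king" close to "queen"), which is the property the techniques listed in this section are designed to learn.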

Papers

Showing 3121–3130 of 4002 papers

Title | Status | Hype
Cross-lingual Models of Word Embeddings: An Empirical Comparison | Code | 0
A Resource-Light Method for Cross-Lingual Semantic Textual Similarity | Code | 0
Multilingual Relation Extraction using Compositional Universal Schema | Code | 0
Structured Embedding Models for Grouped Data | Code | 0
Contributions to Clinical Named Entity Recognition in Portuguese | Code | 0
Multilingual Semantic Parsing And Code-Switching | Code | 0
Representation Degeneration Problem in Training Natural Language Generation Models | Code | 0
word2ket: Space-efficient Word Embeddings inspired by Quantum Entanglement | Code | 0
Representation learning for very short texts using weighted word embedding aggregation | Code | 0
Breaking Free Transformer Models: Task-specific Context Attribution Promises Improved Generalizability Without Fine-tuning Pre-trained LLMs | Code | 0

No leaderboard results yet.