
Word Embeddings

Word embedding is the collective name for a set of language modeling and feature learning techniques in natural language processing (NLP) where words or phrases from the vocabulary are mapped to vectors of real numbers.

Techniques for learning word embeddings include Word2Vec, GloVe, and other approaches that train on an NLP task such as language modeling or document classification.
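To make the mapping concrete, the sketch below trains a small skip-gram Word2Vec model with the gensim library and reads back a word's vector. This is a minimal illustration, assuming gensim >= 4.0; the toy corpus and hyperparameters are invented for the example and are not drawn from any paper listed below.

```python
# Minimal Word2Vec (skip-gram) sketch using gensim.
# Assumption: gensim >= 4.0 is installed; the corpus and hyperparameters
# below are toy values chosen only to illustrate the API.
from gensim.models import Word2Vec

corpus = [
    ["word", "embeddings", "map", "words", "to", "vectors"],
    ["similar", "words", "get", "similar", "vectors"],
    ["word2vec", "learns", "embeddings", "from", "context"],
]

model = Word2Vec(
    sentences=corpus,
    vector_size=50,  # dimensionality of the real-valued vectors
    window=2,        # context window on each side of the target word
    min_count=1,     # keep every word in this tiny corpus
    sg=1,            # 1 = skip-gram, 0 = CBOW
    epochs=100,
    workers=1,       # single worker for a deterministic toy run
    seed=42,
)

vec = model.wv["embeddings"]  # one fixed 50-dimensional vector per word
print(vec.shape)              # (50,)

# Nearest neighbours by cosine similarity in the learned vector space.
print(model.wv.most_similar("words", topn=3))
```

On a corpus this small the neighbours are noisy; the point is only the shape of the interface: every vocabulary word is mapped to a single fixed-length vector of real numbers, and similarity between words becomes distance between vectors.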

(Image credit: Dynamic Word Embedding for Evolving Semantic Discovery)

Papers

Showing 511–520 of 4002 papers

Title | Status | Hype
A Hmong Corpus with Elaborate Expression Annotations | - | 0
A General Framework for Detecting Metaphorical Collocations | - | 0
Metaphor Detection for Low Resource Languages: From Zero-Shot to Few-Shot Learning in Middle High German | Code | 0
XLNET-GRU Sentiment Regression Model for Cryptocurrency News in English and Malay | - | 0
Compiling a Highly Accurate Bilingual Lexicon by Combining Different Approaches | - | 0
Automating Idea Unit Segmentation and Alignment for Assessing Reading Comprehension via Summary Protocol Analysis | - | 0
Measuring Similarity by Linguistic Features rather than Frequency | - | 0
Sentence Selection Strategies for Distilling Word Embeddings from BERT | - | 0
BERTrade: Using Contextual Embeddings to Parse Old French | - | 0
Evolving Large Text Corpora: Four Versions of the Icelandic Gigaword Corpus | - | 0

No leaderboard results yet.