
Word Embeddings

Word embedding is the collective name for a set of language modeling and feature learning techniques in natural language processing (NLP) where words or phrases from the vocabulary are mapped to vectors of real numbers.
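As a minimal sketch of that mapping (the vocabulary, dimensionality, and random values below are illustrative assumptions; a trained model would learn these vectors from data):

    import numpy as np

    # A toy embedding table: each word in the vocabulary is mapped to a
    # dense vector of real numbers (4-dimensional here for brevity).
    vocab = ["king", "queen", "apple"]            # illustrative vocabulary
    rng = np.random.default_rng(0)
    embedding = {word: rng.normal(size=4) for word in vocab}

    # Looking up a word yields its vector; in a trained model, semantically
    # related words end up with nearby vectors (e.g. by cosine similarity).
    print(embedding["king"])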

Techniques for learning word embeddings include Word2Vec, GloVe, and a variety of neural network-based approaches that train on an NLP task such as language modeling or document classification.
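For the Word2Vec case, a minimal training sketch using the gensim library follows; the toy corpus and hyperparameter values are illustrative assumptions, not settings recommended by any of the papers listed below.

    from gensim.models import Word2Vec

    # Tiny placeholder corpus: each sentence is a list of tokens.
    corpus = [
        ["the", "king", "rules", "the", "kingdom"],
        ["the", "queen", "rules", "the", "kingdom"],
        ["apples", "grow", "on", "trees"],
    ]

    # Train a skip-gram Word2Vec model (sg=1); vector_size, window,
    # min_count, and epochs are illustrative, not tuned values.
    model = Word2Vec(corpus, vector_size=50, window=2, min_count=1,
                     sg=1, epochs=50)

    vec = model.wv["king"]                # learned 50-dim vector for "king"
    print(model.wv.most_similar("king")) # nearest neighbours by cosine similarity

GloVe, by contrast, fits word vectors to global co-occurrence statistics rather than predicting words from local context windows.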

(Image credit: Dynamic Word Embedding for Evolving Semantic Discovery)

Papers

Showing 931–940 of 4,002 papers

Title | Status | Hype
Design and Implementation of a Quantum Kernel for Natural Language Processing | Code | 0
Resolving Prepositional Phrase Attachment Ambiguities with Contextualized Word Embeddings | Code | 0
Detecting Unassimilated Borrowings in Spanish: An Annotated Corpus and Approaches to Modeling | Code | 0
Cross-lingual Dependency Parsing with Unlabeled Auxiliary Languages | Code | 0
Density Matching for Bilingual Word Embedding | Code | 0
ReviewViz: Assisting Developers Perform Empirical Study on Energy Consumption Related Reviews for Mobile Applications | Code | 0
CWTM: Leveraging Contextualized Word Embeddings from BERT for Neural Topic Modeling | Code | 0
Dependency Sensitive Convolutional Neural Networks for Modeling Sentences and Documents | Code | 0
Cross-lingual Lexical Sememe Prediction | Code | 0
A Comprehensive Comparison of Word Embeddings in Event & Entity Coreference Resolution | Code | 0
Page 94 of 401

Leaderboards

No leaderboard results yet.