
Word Embeddings

Word embedding is the collective name for a set of language modeling and feature learning techniques in natural language processing (NLP) where words or phrases from the vocabulary are mapped to vectors of real numbers.
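
To make the definition concrete, the sketch below shows the mapping as a simple lookup table. The toy vocabulary, the 4-dimensional vectors, and the random initialization are illustrative assumptions; trained embeddings are learned from data rather than sampled.

```python
import numpy as np

# Hypothetical toy vocabulary; real systems build this from a corpus.
vocab = {"king": 0, "queen": 1, "man": 2, "woman": 3}

# Embedding matrix: one real-valued vector per word (4 dimensions here;
# 50-300 is more typical in practice).
rng = np.random.default_rng(seed=0)
embeddings = rng.normal(size=(len(vocab), 4))

def embed(word: str) -> np.ndarray:
    # The "mapping" is just a table lookup into the matrix.
    return embeddings[vocab[word]]

print(embed("queen"))  # a 4-dimensional vector of real numbers
```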

Techniques for learning word embeddings include Word2Vec, GloVe, and other neural-network-based approaches that are trained on an NLP task such as language modeling or document classification.
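
As a rough illustration of how one of these techniques is used in practice, here is a minimal sketch with the gensim library's Word2Vec implementation. The toy corpus and the hyperparameter values are placeholder assumptions, not tuned or recommended settings.

```python
from gensim.models import Word2Vec

# Hypothetical toy corpus: a list of tokenized sentences.
corpus = [
    ["the", "king", "rules", "the", "kingdom"],
    ["the", "queen", "rules", "the", "kingdom"],
    ["the", "cat", "sat", "on", "the", "mat"],
]

# Train a skip-gram Word2Vec model (sg=1); vector_size, window,
# min_count, and epochs are illustrative values only.
model = Word2Vec(corpus, vector_size=32, window=2, min_count=1, sg=1, epochs=50)

vector = model.wv["king"]                           # learned 32-dim embedding
neighbors = model.wv.most_similar("king", topn=2)   # nearest words by cosine similarity
print(vector.shape, neighbors)
```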

(Image credit: Dynamic Word Embedding for Evolving Semantic Discovery)

Papers

Showing papers 3881–3890 of 4002

Title | Status | Hype
Using natural language processing techniques to extract information on the properties and functionalities of energetic materials from large text corpora | Code | 0
GX at SemEval-2021 Task 2: BERT with Lemma Information for MCL-WiC Task | Code | 0
Analyzing Continuous Semantic Shifts with Diachronic Word Similarity Matrices | Code | 0
MIPA: Mutual Information Based Paraphrase Acquisition via Bilingual Pivoting | Code | 0
AutoSUM: Automating Feature Extraction and Multi-user Preference Simulation for Entity Summarization | Code | 0
Word Ordering as Unsupervised Learning Towards Syntactically Plausible Word Representations | Code | 0
Misspelling Oblivious Word Embeddings | Code | 0
Harnessing Cross-lingual Features to Improve Cognate Detection for Low-resource Languages | Code | 0
A General Framework for Implicit and Explicit Debiasing of Distributional Word Vector Spaces | Code | 0
Question Embeddings Based on Shannon Entropy: Solving intent classification task in goal-oriented dialogue system | Code | 0

No leaderboard results yet.