
Word Embeddings

Word embedding is the collective name for a set of language modeling and feature learning techniques in natural language processing (NLP) where words or phrases from the vocabulary are mapped to vectors of real numbers.
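
As a toy illustration of this mapping (not part of the original page), the sketch below hard-codes a tiny embedding table and uses cosine similarity so that geometric closeness stands in for semantic relatedness. The vocabulary and vector values are invented for illustration only.

```python
# Toy illustration of the word -> vector mapping described above.
# The vocabulary and vector values are invented for this sketch.
import math

embedding = {
    "king":  [0.80, 0.65, 0.10],
    "queen": [0.78, 0.70, 0.12],
    "apple": [0.10, 0.05, 0.90],
}

def cosine(u, v):
    """Cosine similarity: geometric closeness stands in for relatedness."""
    dot = sum(a * b for a, b in zip(u, v))
    norm_u = math.sqrt(sum(a * a for a in u))
    norm_v = math.sqrt(sum(b * b for b in v))
    return dot / (norm_u * norm_v)

print(cosine(embedding["king"], embedding["queen"]))  # high: related words
print(cosine(embedding["king"], embedding["apple"]))  # low: unrelated words
```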

Techniques for learning word embeddings include Word2Vec, GloVe, and other approaches, typically neural networks trained on an NLP task such as language modeling or document classification.
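
As a minimal sketch of one such technique, the example below trains a skip-gram Word2Vec model with the gensim library (assuming gensim >= 4.0 is installed); the toy corpus and hyperparameter values are illustrative only, not tuned.

```python
# Minimal sketch: training skip-gram Word2Vec embeddings with gensim.
# Assumes gensim >= 4.0 is installed; corpus and hyperparameters are
# illustrative only, not tuned.
from gensim.models import Word2Vec

corpus = [
    ["the", "king", "rules", "the", "kingdom"],
    ["the", "queen", "rules", "the", "kingdom"],
    ["embeddings", "map", "words", "to", "vectors"],
]

model = Word2Vec(
    sentences=corpus,
    vector_size=50,  # dimensionality of the embedding vectors
    window=2,        # context window size
    min_count=1,     # keep every word in this tiny corpus
    sg=1,            # 1 = skip-gram, 0 = CBOW
    epochs=50,
)

vector = model.wv["king"]  # 50-dimensional real-valued vector for "king"
print(model.wv.most_similar("king", topn=3))
```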

(Image credit: Dynamic Word Embedding for Evolving Semantic Discovery)

Papers

Showing 611-620 of 4,002 papers

| Title | Status | Hype |
| --- | --- | --- |
| Chinese Hypernym-Hyponym Extraction from User Generated Categories |  | 0 |
| CLaC at SemEval-2020 Task 5: Muli-task Stacked Bi-LSTMs |  | 0 |
| Clustering is Efficient for Approximate Maximum Inner Product Search |  | 0 |
| "A Passage to India": Pre-trained Word Embeddings for Indian Languages |  | 0 |
| A Joint Model for Word Embedding and Word Morphology |  | 0 |
| AI-KU at SemEval-2016 Task 11: Word Embeddings and Substring Features for Complex Word Identification |  | 0 |
| A Cross-lingual Natural Language Processing Framework for Infodemic Management |  | 0 |
| Character-based Neural Machine Translation |  | 0 |
| N-best Parse Rescoring Based on Dependency-Based Word Embeddings |  | 0 |
Page 62 of 401

No leaderboard results yet.