
Word Embeddings

Word embedding is the collective name for a set of language modeling and feature learning techniques in natural language processing (NLP) where words or phrases from the vocabulary are mapped to vectors of real numbers.
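To make "mapped to vectors of real numbers" concrete, here is a minimal sketch in Python of the underlying lookup table; the four-word vocabulary, the 5-dimensional vectors, and the embed helper are illustrative assumptions, and the vectors here are random rather than learned:

```python
# Minimal sketch: each vocabulary word indexes one row of a real-valued matrix.
import numpy as np

vocab = {"king": 0, "queen": 1, "man": 2, "woman": 3}  # toy vocabulary
dim = 5  # embedding dimensionality (trained systems typically use 50-300+)

rng = np.random.default_rng(0)
embeddings = rng.normal(size=(len(vocab), dim))  # one vector per word

def embed(word: str) -> np.ndarray:
    """Look up the real-valued vector for a word."""
    return embeddings[vocab[word]]

print(embed("queen"))  # a 5-dimensional vector of real numbers
```

In a trained model, the rows of this matrix are fit so that words occurring in similar contexts end up with nearby vectors.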

Techniques for learning word embeddings include Word2Vec, GloVe, and other neural-network-based approaches that train on an NLP task such as language modeling or document classification.
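As one example of such a technique, the sketch below trains a skip-gram Word2Vec model with gensim (assuming gensim 4.x); the four-sentence corpus and every hyperparameter value are toy choices for illustration, not recommendations:

```python
# Sketch: train skip-gram Word2Vec on a toy corpus, then query the vectors.
from gensim.models import Word2Vec

sentences = [
    ["the", "king", "rules", "the", "kingdom"],
    ["the", "queen", "rules", "the", "kingdom"],
    ["a", "man", "walks", "in", "the", "city"],
    ["a", "woman", "walks", "in", "the", "city"],
]

model = Word2Vec(
    sentences,
    vector_size=50,  # embedding dimensionality
    window=2,        # context window size
    min_count=1,     # keep every word in this tiny corpus
    sg=1,            # 1 = skip-gram, 0 = CBOW
    epochs=100,
)

print(model.wv["queen"])                      # the learned 50-d vector
print(model.wv.most_similar("king", topn=3))  # nearest neighbours by cosine similarity
```

GloVe, by contrast, is fit to global word co-occurrence statistics rather than trained with a sliding window over the corpus.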

(Image credit: Dynamic Word Embedding for Evolving Semantic Discovery)

Papers

Showing 981–990 of 4002 papers (page 99 of 401)

Title | Status | Hype
A Deep Learning System for Automatic Extraction of Typological Linguistic Information from Descriptive Grammars | - | 0
Position Masking for Improved Layout-Aware Document Understanding | - | 0
Comparing Contextual and Static Word Embeddings with Small Data | - | 0
The Impact of Word Embeddings on Neural Dependency Parsing | - | 0
Word Discriminations for Vocabulary Inventory Prediction | Code | 0
Abstractive Document Summarization with Word Embedding Reconstruction | - | 0
Bilingual Terminology Extraction Using Neural Word Embeddings on Comparable Corpora | - | 0
Siamese Networks for Inference in Malayalam Language Texts | - | 0
Sense representations for Portuguese: experiments with sense embeddings and deep neural language models | - | 0
Effectiveness of Deep Networks in NLP using BiDAF as an example architecture | - | 0

Leaderboards

No leaderboard results yet.