Word Embeddings

Word embedding is the collective name for a set of language modeling and feature learning techniques in natural language processing (NLP) where words or phrases from the vocabulary are mapped to vectors of real numbers.

Techniques for learning word embeddings include Word2Vec, GloVe, and other approaches that train on an NLP task such as language modeling or document classification.
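As a minimal sketch of the idea, the snippet below builds count-based word vectors from a toy corpus: it forms a word co-occurrence matrix and factorizes it with SVD so each word gets a small dense vector of real numbers. This is an illustration of embedding from co-occurrence statistics (in the spirit of GloVe-style matrix factorization), not the actual Word2Vec or GloVe training procedures; the corpus, window size, and dimension are arbitrary choices for the example.

```python
import numpy as np

# Toy corpus: each sentence is a list of tokens.
corpus = [
    "the cat sat on the mat".split(),
    "the dog sat on the rug".split(),
    "the cat chased the dog".split(),
]

# Build the vocabulary and an index for each word.
vocab = sorted({w for sent in corpus for w in sent})
idx = {w: i for i, w in enumerate(vocab)}

# Count symmetric co-occurrences within a +/-2 word window.
C = np.zeros((len(vocab), len(vocab)))
for sent in corpus:
    for i, w in enumerate(sent):
        for j in range(max(0, i - 2), min(len(sent), i + 3)):
            if i != j:
                C[idx[w], idx[sent[j]]] += 1.0

# Factorize the co-occurrence matrix and keep the top-k directions:
# each row of `embeddings` is a k-dimensional real vector for one word.
U, S, _ = np.linalg.svd(C)
k = 4
embeddings = U[:, :k] * S[:k]
vec = {w: embeddings[idx[w]] for w in vocab}

def cosine(a, b):
    """Cosine similarity between two embedding vectors."""
    return float(a @ b / (np.linalg.norm(a) * np.linalg.norm(b) + 1e-9))

# Words appearing in similar contexts get similar vectors.
print(cosine(vec["cat"], vec["dog"]))
```

In practice one would use a trained model (e.g. Word2Vec's skip-gram objective) rather than raw SVD on such a tiny corpus, but the output has the same shape: a real-valued vector per vocabulary word, compared via cosine similarity.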

(Image credit: Dynamic Word Embedding for Evolving Semantic Discovery)

Papers

Showing 3601-3625 of 4002 papers

Title | Status | Hype
An Efficient Cross-lingual Model for Sentence Classification Using Convolutional Neural Network | — | 0
Neural Networks For Negation Scope Detection | — | 0
Word Embedding Calculus in Meaningful Ultradense Subspaces | — | 0
How to Train good Word Embeddings for Biomedical NLP | Code | 0
Word embeddings and discourse information for Quality Estimation | — | 0
A Latent Concept Topic Model for Robust Topic Inference Using Word Embeddings | — | 0
SHEF-LIUM-NN: Sentence level Quality Estimation with Neural Network Features | — | 0
Word Embeddings with Limited Memory | — | 0
Implicit Discourse Relation Detection via a Deep Architecture with Gated Relevance Network | — | 0
Semantics-Driven Recognition of Collocations Using Word Embeddings | — | 0
Incorporating Relational Knowledge into Word Representations using Subspace Regularization | — | 0
Improved Semantic Representation for Domain-Specific Entities | — | 0
Intrinsic Evaluations of Word Embeddings: What Can We Do Better? | — | 0
Investigating Language Universal and Specific Properties in Word Embeddings | — | 0
Is "Universal Syntax" Universally Useful for Learning Distributed Word Representations? | — | 0
Jointly Learning to Embed and Predict with Multiple Languages | — | 0
Sentence Embedding Evaluation Using Pyramid Annotation | — | 0
Visual Relationship Detection with Language Priors | — | 0
Cseq2seq: Cyclic Sequence-to-Sequence Learning | — | 0
Man is to Computer Programmer as Woman is to Homemaker? Debiasing Word Embeddings | Code | 0
An Empirical Evaluation of doc2vec with Practical Insights into Document Embedding Generation | Code | 0
Language classification from bilingual word embedding graphs | — | 0
Enriching Word Vectors with Subword Information | Code | 0
The Benefits of Word Embeddings Features for Active Learning in Clinical Information Extraction | — | 0
Mapping distributional to model-theoretic semantic spaces: a baseline | Code | 0
Page 145 of 161

No leaderboard results yet.