
Word Embeddings

Word embedding is the collective name for a set of language modeling and feature learning techniques in natural language processing (NLP) where words or phrases from the vocabulary are mapped to vectors of real numbers.

Techniques for learning word embeddings include Word2Vec, GloVe, and other neural network-based approaches that train on an NLP task such as language modeling or document classification.
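As an illustration of the idea, the following is a minimal sketch of skip-gram with negative sampling (the training objective used by Word2Vec), written in plain NumPy. The toy corpus, embedding dimension, and hyperparameters are arbitrary choices for demonstration, not values from any of the papers listed below.

```python
import numpy as np

# Toy corpus; real word embedding methods train on large text collections.
corpus = "the cat sat on the mat the dog sat on the rug".split()
vocab = sorted(set(corpus))
w2i = {w: i for i, w in enumerate(vocab)}
V, D = len(vocab), 8  # vocabulary size, embedding dimension

rng = np.random.default_rng(0)
W_in = rng.normal(scale=0.1, size=(V, D))   # input (word) embeddings
W_out = rng.normal(scale=0.1, size=(V, D))  # output (context) embeddings

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

# Skip-gram with negative sampling: for each center word, push its vector
# toward vectors of observed context words and away from random negatives.
lr, window, n_neg = 0.05, 2, 3
for epoch in range(200):
    for pos, word in enumerate(corpus):
        center = w2i[word]
        for off in range(-window, window + 1):
            ctx_pos = pos + off
            if off == 0 or not (0 <= ctx_pos < len(corpus)):
                continue
            # One positive context word plus a few random negative samples.
            targets = [w2i[corpus[ctx_pos]]] + list(rng.integers(0, V, n_neg))
            labels = [1.0] + [0.0] * n_neg
            for t, y in zip(targets, labels):
                score = sigmoid(W_in[center] @ W_out[t])
                grad = score - y  # gradient of logistic loss w.r.t. the score
                g_in = grad * W_out[t]
                W_out[t] -= lr * grad * W_in[center]
                W_in[center] -= lr * g_in
```

After training, each row of `W_in` is that word's embedding; semantically similar words can be found by ranking rows by cosine similarity.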

(Image credit: Dynamic Word Embedding for Evolving Semantic Discovery)

Papers

Showing 25 of 4002 papers

All papers below currently have a Hype score of 0; [Code] marks papers with an available code implementation.

- Improving Neural Knowledge Base Completion with Cross-Lingual Projections
- Word Sense Filtering Improves Embedding-Based Lexical Substitution
- Semantic Similarity of Arabic Sentences with Word Embeddings
- Inducing Embeddings for Rare and Unseen Words by Leveraging Lexical Resources
- Integrating Semantic Knowledge into Lexical Embeddings Based on Information Content Measurement [Code]
- Learning to Negate Adjectives with Bilinear Models
- Elucidating Conceptual Properties from Word Embeddings
- Modelling metaphor with attribute-based semantics
- Centroid-based Text Summarization through Compositionality of Word Embeddings [Code]
- Nonsymbolic Text Representation
- The Language of Place: Semantic Value from Geospatial Context
- Automatic Argumentative-Zoning Using Word2vec [Code]
- Diving Deep into Clickbaits: Who Use Them to What Extents in Which Topics with What Effects?
- An embedded segmental K-means model for unsupervised segmentation and clustering of speech [Code]
- Dynamic Bernoulli Embeddings for Language Evolution [Code]
- Story Cloze Ending Selection Baselines and Data Examination
- What can you do with a rock? Affordance extraction via word embeddings
- Unsupervised Learning of Sentence Embeddings using Compositional n-Gram Features [Code]
- Orthogonalized ALS: A Theoretically Principled Tensor Decomposition Algorithm for Practical Use
- Sound-Word2Vec: Learning Word Representations Grounded in Sounds
- A Comparative Study of Word Embeddings for Reading Comprehension
- Dynamic Word Embeddings for Evolving Semantic Discovery [Code]
- Dynamic Word Embeddings [Code]
- Use Generalized Representations, But Do Not Forget Surface Features
- LTSG: Latent Topical Skip-Gram for Mutually Learning Topic Model and Vector Representations
Page 136 of 161

No leaderboard results yet.