
Word Embeddings

Word embedding is the collective name for a set of language modeling and feature learning techniques in natural language processing (NLP) where words or phrases from the vocabulary are mapped to vectors of real numbers.

Techniques for learning word embeddings include Word2Vec, GloVe, and other neural-network-based approaches that train on an NLP task such as language modeling or document classification.
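Because an embedding is just a learned vector of real numbers per word, the idea is easy to demonstrate in code. Below is a minimal sketch using the gensim library (assuming gensim >= 4.0, where the dimensionality parameter is named vector_size); the toy corpus and all hyperparameter values are illustrative assumptions, not taken from any paper listed on this page.

```python
# Minimal sketch: training skip-gram word embeddings with gensim's Word2Vec.
# The corpus and hyperparameters below are toy values for illustration only.
from gensim.models import Word2Vec

# Each document is a list of tokens; real use would tokenize a large corpus.
corpus = [
    ["word", "embeddings", "map", "words", "to", "vectors"],
    ["glove", "and", "word2vec", "learn", "embeddings", "from", "text"],
    ["vectors", "capture", "semantic", "similarity", "between", "words"],
]

# vector_size: dimensionality of each word vector; sg=1 selects skip-gram.
model = Word2Vec(corpus, vector_size=50, window=2, min_count=1, sg=1, epochs=50)

vec = model.wv["embeddings"]   # a 50-dimensional real-valued vector
print(vec.shape)               # (50,)
print(model.wv.most_similar("embeddings", topn=3))
```

On a toy corpus this small the neighbors are essentially noise; with a realistically sized corpus, words that appear in similar contexts end up with nearby vectors, which is what makes the representation useful for downstream tasks.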

(Image credit: Dynamic Word Embedding for Evolving Semantic Discovery)

Papers

Showing 3131-3140 of 4002 papers

Title | Status | Hype
Class-based Prediction Errors to Detect Hate Speech with Out-of-vocabulary Words |  | 0
Parameter Free Hierarchical Graph-Based Clustering for Analyzing Continuous Word Embeddings |  | 0
An Ontology-Based Method for Extracting and Classifying Domain-Specific Compositional Nominal Compounds |  | 0
Event Detection Using Frame-Semantic Parser |  | 0
Cross-Lingual Classification of Topics in Political Texts |  | 0
Clinical Event Detection with Hybrid Neural Architecture |  | 0
Representations of Time Expressions for Temporal Relation Extraction with Convolutional Neural Networks |  | 0
Metaphor Detection in a Poetry Corpus |  | 0
Automated Preamble Detection in Dictated Medical Reports |  | 0
Does the Geometry of Word Embeddings Help Document Classification? A Case Study on Persistent Homology-Based Representations |  | 0
Page 314 of 401

No leaderboard results yet.