
Learning Contextual Embeddings for Structural Semantic Similarity using Categorical Information

2017-08-01 · CoNLL 2017

Massimo Nicosia, Alessandro Moschitti


Abstract

Tree kernels (TKs) and neural networks are two effective approaches to automatic feature engineering. In this paper, we combine them by modeling contextual word similarity in semantic TKs. This way, the latter can perform subtree matching by applying neural-based similarity to tree lexical nodes. We study how to learn representations for words in context such that TKs can exploit more focused information. We find that neural embeddings produced by current methods do not provide a suitable contextual similarity. Thus, we define a new approach based on a Siamese network, which produces word representations while learning a binary text similarity, treating examples in the same category as similar. Experiments on question and sentiment classification show that our semantic TK substantially improves on previous results.
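The two ingredients above can be sketched minimally: a similarity score between word embeddings that a semantic TK could apply at lexical leaf nodes, and a contrastive objective of the kind a Siamese network might minimize when same-category pairs are labeled similar. This is an illustrative sketch only; the function names, the choice of cosine similarity, and the specific contrastive loss are assumptions, not the paper's actual architecture.

```python
import numpy as np

def cosine_sim(u, v):
    # Similarity a semantic TK could apply when matching two lexical
    # leaf nodes (assumed scoring function, not the paper's exact one).
    return float(np.dot(u, v) / (np.linalg.norm(u) * np.linalg.norm(v)))

def contrastive_loss(u, v, y, margin=1.0):
    # One possible Siamese objective: pairs from the same category
    # (y=1) are pulled together; pairs from different categories (y=0)
    # are pushed apart until they exceed the margin.
    d = np.linalg.norm(u - v)  # Euclidean distance between embeddings
    return y * d**2 + (1 - y) * max(0.0, margin - d)**2

u = np.array([1.0, 0.0])
v = np.array([0.8, 0.6])
print(round(cosine_sim(u, v), 3))          # 0.8
print(round(contrastive_loss(u, v, 1), 3)) # 0.4
```

Training the Siamese network on many such pairs drives the learned embeddings toward a similarity that reflects category membership, which is the contextual signal the semantic TK then exploits during subtree matching.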
