
LIME: Low-Cost and Incremental Learning for Dynamic Heterogeneous Information Networks

2021-02-11 · IEEE Transactions on Computers, 2021 · Code Available

Hao Peng; Renyu Yang; Zheng Wang; Jianxin Li; Lifang He; Philip S. Yu; Albert Y. Zomaya; Rajiv Ranjan


Abstract

Understanding the interconnected relationships of large-scale information networks, such as social, scholarly, and Internet of Things networks, is vital for tasks like recommendation and fraud detection. The vast majority of real-world networks are inherently heterogeneous and dynamic: they contain many different types of nodes and edges, and they can change drastically over time. This dynamicity and heterogeneity make it extremely challenging to reason about the network structure. Unfortunately, existing approaches are inadequate for modeling real-life dynamic networks: they either rely on strong assumptions about a given stochastic process or fail to capture the heterogeneity of the network structure, and they all require extensive computational resources. We introduce Lime, a better approach for modeling dynamic and heterogeneous information networks. Lime is designed to extract high-quality network representations with significantly lower memory consumption and computation time than the state-of-the-art. Unlike prior work that uses a distinct vector to encode each network node, we exploit the semantic relationships among network nodes to encode multiple nodes with similar semantics in shared vectors. By using far fewer node vectors, our approach significantly reduces the memory required to encode large-scale networks. To effectively trade information sharing for a reduced memory footprint, we employ a recursive neural network (RsNN) with carefully designed optimization strategies to explore the node semantics in a novel cuboid space. We then go further by showing, for the first time, how an effective incremental learning approach can be developed (with the help of the RsNN, our cuboid structure, and a set of novel optimization techniques) to allow a learning framework to quickly and efficiently adapt to a constantly evolving network. We evaluate Lime by applying it to three representative network-based tasks: node classification, node clustering, and anomaly detection, performin...
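The memory-saving idea in the abstract (encoding many semantically similar nodes with one shared vector instead of one vector per node) can be illustrated with a toy sketch. This is a hypothetical illustration, not the paper's implementation: here node-to-group assignments are random, whereas Lime derives them from node semantics via its RsNN and cuboid structure.

```python
import numpy as np

rng = np.random.default_rng(0)
num_nodes, dim = 10_000, 128

# Baseline: one embedding vector per node.
per_node = rng.normal(size=(num_nodes, dim)).astype(np.float32)

# Shared encoding: each node points to one of a small number of
# "semantic group" vectors (assignments are random here for the sketch).
num_groups = 100
group_of = rng.integers(0, num_groups, size=num_nodes)
shared = rng.normal(size=(num_groups, dim)).astype(np.float32)

def embed(node_id: int) -> np.ndarray:
    """Look up a node's embedding via its shared group vector."""
    return shared[group_of[node_id]]

# Memory comparison: shared vectors plus a per-node index are far
# smaller than a full per-node embedding table.
bytes_baseline = per_node.nbytes
bytes_shared = shared.nbytes + group_of.nbytes
print(bytes_baseline, bytes_shared)
```

Nodes assigned to the same group get identical embeddings, which is exactly the information-sharing-for-memory trade-off the abstract describes; Lime's optimization strategies decide the grouping so that this sharing loses as little representational quality as possible.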
