SOTAVerified

Network Embedding

Network Embedding, also known as "Network Representation Learning", is a collective term for techniques that map graph nodes to vectors of real numbers in a multidimensional space. To be useful, an embedding should preserve the structure of the graph; the vectors can then be used as input to various network and graph analysis tasks, such as link prediction.

Source: Tutorial on NLP-Inspired Network Embedding
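The idea above can be sketched with a minimal spectral embedding, one classic embedding family: nodes are placed at the coordinates of the top eigenvectors of the normalized adjacency matrix, and link likelihood is scored by dot product. The toy graph and function names below are illustrative assumptions, not taken from any paper listed here.

```python
import numpy as np

def spectral_embedding(adj, dim=2):
    """Embed nodes using the top-`dim` eigenvectors of the
    symmetrically normalized adjacency matrix D^-1/2 A D^-1/2."""
    deg = adj.sum(axis=1)
    d_inv_sqrt = np.where(deg > 0, deg ** -0.5, 0.0)
    norm_adj = d_inv_sqrt[:, None] * adj * d_inv_sqrt[None, :]
    vals, vecs = np.linalg.eigh(norm_adj)  # eigenvalues in ascending order
    return vecs[:, -dim:]                  # top eigenvectors as coordinates

# Toy graph (an assumption for illustration): two triangles
# (nodes 0-2 and 3-5) joined by a single bridge edge (2, 3).
adj = np.zeros((6, 6))
for u, v in [(0, 1), (1, 2), (0, 2), (3, 4), (4, 5), (3, 5), (2, 3)]:
    adj[u, v] = adj[v, u] = 1.0

emb = spectral_embedding(adj, dim=2)

def link_score(u, v):
    """Higher dot product -> more likely link under the embedding."""
    return float(emb[u] @ emb[v])

# A structure-preserving embedding should score same-triangle pairs
# higher than pairs separated by the bridge.
print(link_score(0, 1) > link_score(0, 5))  # → True
```

Many of the methods listed below (DeepWalk-style skip-gram models, matrix-factorization approaches such as NetSMF and ProNE) can be read as scalable or learned refinements of this same recipe: factor some notion of node proximity into low-dimensional vectors.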

Papers

Showing 251–300 of 403 papers

Title | Status | Hype
Network2Vec Learning Node Representation Based on Space Mapping in Networks | — | 0
HiGitClass: Keyword-Driven Hierarchical Classification of GitHub Repositories | Code | 1
Tutorial on NLP-Inspired Network Embedding | — | 0
RiWalk: Fast Structural Node Embedding via Role Identification | Code | 0
Dynamic Embedding on Textual Networks via a Gaussian Process | Code | 0
Improving Textual Network Learning with Variational Homophilic Embeddings | Code | 0
Neural Embedding Propagation on Heterogeneous Networks | Code | 0
Multi-scale Attributed Node Embedding | Code | 0
Selective Network Discovery via Deep Reinforcement Learning on Embedded Spaces | — | 0
Temporal Network Embedding with Micro- and Macro-dynamics | Code | 0
HeteSpaceyWalk: A Heterogeneous Spacey Random Walk for Heterogeneous Information Network Embedding | — | 0
Fast and Accurate Network Embeddings via Very Sparse Random Projection | Code | 1
Adversarial Training Methods for Network Embedding | Code | 1
Initialization for Network Embedding: A Graph Partition Approach | — | 0
NETR-Tree: An Efficient Framework for Social-Based Time-Aware Spatial Keyword Query | — | 0
LEAP nets for power grid perturbations | Code | 0
On Proximity and Structural Role-based Embeddings in Networks: Misconceptions, Techniques, and Applications | — | 0
AHINE: Adaptive Heterogeneous Information Network Embedding | — | 0
MEGAN: A Generative Adversarial Network for Multi-View Network Embedding | — | 0
HONEM: Learning Embedding for Higher Order Networks | — | 0
Domain-adversarial Network Alignment | Code | 0
Deep Hashing for Signed Social Network Embedding | — | 0
ProNE: Fast and Scalable Network Representation Learning | Code | 0
DynWalks: Global Topology and Recent Changes Awareness Dynamic Network Embedding | Code | 1
IntentGC: a Scalable Graph Convolution Framework Fusing Heterogeneous Information for Recommendation | Code | 0
Improving Skip-Gram based Graph Embeddings via Centrality-Weighted Sampling | — | 0
PathRank: A Multi-Task Learning Framework to Rank Paths in Spatial Networks | — | 0
Graph Representation Learning via Hard and Channel-Wise Attention Networks | Code | 0
Network Embedding: on Compression and Learning | Code | 0
Signed Graph Attention Networks | Code | 1
NetSMF: Large-Scale Network Embedding as Sparse Matrix Factorization | Code | 0
Dynamic Network Embeddings for Network Evolution Analysis | — | 0
ANAE: Learning Node Context Representation for Attributed Network Embedding | — | 0
DISCO: Influence Maximization Meets Network Embedding and Deep Learning | — | 0
Network Representation of Large-Scale Heterogeneous RNA Sequences with Integration of Diverse Multi-omics, Interactions, and Annotations Data | — | 0
Homogeneous Network Embedding for Massive Graphs via Reweighted Personalized PageRank | — | 0
Embedding Biomedical Ontologies by Jointly Encoding Network Structure and Textual Node Descriptors | Code | 0
Dynamic Network Embedding via Incremental Skip-gram with Negative Sampling | Code | 0
Improving Textual Network Embedding with Global Attention via Optimal Transport | — | 0
Task-Guided Pair Embedding in Heterogeneous Network | Code | 0
DANE: Domain Adaptive Network Embedding | Code | 1
Is a Single Vector Enough? Exploring Node Polysemy for Network Embedding | Code | 0
Relation Structure-Aware Heterogeneous Information Network Embedding | — | 0
ActiveHNE: Active Heterogeneous Network Embedding | — | 0
Representation Learning for Attributed Multiplex Heterogeneous Network | Code | 1
Physiological Signal Embeddings (PHASE) via Interpretable Stacked Models | — | 0
ExplaiNE: An Approach for Explaining Network Embedding-based Link Predictions | — | 0
Tag2Vec: Learning Tag Representations in Tag Networks | — | 0
Compositional Network Embedding | — | 0
Data driven approximation of parametrized PDEs by Reduced Basis and Neural Networks | Code | 0
Page 6 of 9

No leaderboard results yet.