
25 Recent Deep GNN Studies: Solutions Interpreted from Multiple Perspectives

2021-12-27 18:32 | Author: 深度之眼官方賬號

This article is reposted from the WeChat public account 【AI機器學習與知識圖譜】.

In computer vision, CNNs learn increasingly high-level features as more layers are stacked; networks of 64 or even 128 layers are entirely normal and outperform their shallower counterparts.

Graph convolutional networks (GCNs) are a deep learning approach for graph-structured data, yet most current GCN models are shallow: GCN and GAT, for example, achieve their best results at just 2 layers, and performance drops sharply as more layers are added. The culprit is over-smoothing: as a GCN grows deeper, neighboring nodes become more and more similar, until the learned node embeddings can no longer be distinguished and model accuracy collapses.
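
For concreteness, here is a minimal sketch of the kind of 2-layer GCN referred to above, written as a plain numpy forward pass in the style of Kipf & Welling's GCN. The graph, features, and weights are randomly generated purely for illustration; none of this code comes from the papers listed below.

```python
# Minimal 2-layer GCN forward pass (Kipf & Welling style), illustration only.
import numpy as np

rng = np.random.default_rng(0)
n, d_in, d_hid, d_out = 6, 8, 16, 3            # nodes, input/hidden dims, classes

A = (rng.random((n, n)) < 0.4).astype(float)   # random undirected toy graph
A = np.maximum(A, A.T)
np.fill_diagonal(A, 0)

# Symmetric normalization with self-loops: A_hat = D^{-1/2} (A + I) D^{-1/2}
A_tilde = A + np.eye(n)
d_inv_sqrt = 1.0 / np.sqrt(A_tilde.sum(axis=1))
A_hat = A_tilde * d_inv_sqrt[:, None] * d_inv_sqrt[None, :]

X = rng.standard_normal((n, d_in))             # node features
W1 = 0.1 * rng.standard_normal((d_in, d_hid))  # layer weights (untrained)
W2 = 0.1 * rng.standard_normal((d_hid, d_out))

H = np.maximum(A_hat @ X @ W1, 0.0)            # layer 1: propagate, transform, ReLU
logits = A_hat @ H @ W2                        # layer 2: propagate, transform
probs = np.exp(logits) / np.exp(logits).sum(axis=1, keepdims=True)
print(probs.shape)                             # (6, 3): class distribution per node
```

Each layer mixes a node only with its immediate neighbors, so a 2-layer model sees a 2-hop neighborhood; stacking more layers widens the receptive field, but runs straight into the over-smoothing problem described below.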

Why make GNNs deeper at all, and what problems are deeper GNNs suited to? Mainly semi-supervised node classification with few labels, and semi-supervised node classification with few features. Two concepts need to be spelled out first.

1. Over-fitting: In convolutional neural networks, when the architecture is too complex or too deep and the training data is limited, over-fitting occurs: the model over-learns the training data, memorizing the training samples themselves rather than the patterns behind them, and therefore fails to predict accurately on the test set.
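
As a toy illustration of over-fitting in general (randomly generated data, unrelated to the GNN papers below): fitting polynomials of increasing degree to a handful of noisy points drives the training error toward zero, while the error on held-out points typically grows.

```python
# Toy over-fitting demo: a high-degree polynomial memorizes noisy training points.
import numpy as np

rng = np.random.default_rng(0)
true_fn = np.sin                                   # the underlying pattern

x_train = rng.uniform(0, 3, 10)
y_train = true_fn(x_train) + 0.1 * rng.standard_normal(10)
x_test = rng.uniform(0, 3, 100)
y_test = true_fn(x_test) + 0.1 * rng.standard_normal(100)

for degree in (1, 3, 9):
    coeffs = np.polyfit(x_train, y_train, degree)  # least-squares polynomial fit
    train_mse = np.mean((np.polyval(coeffs, x_train) - y_train) ** 2)
    test_mse = np.mean((np.polyval(coeffs, x_test) - y_test) ** 2)
    print(f"degree {degree}: train MSE {train_mse:.4f}, test MSE {test_mse:.4f}")
# The high-degree fit drives training error toward zero while test error is
# typically much larger: it learned the samples, not the pattern behind them.
```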

2. Over-smoothing: In graph neural networks, nodes are wired together by the graph structure itself, and representations are usually learned through neighborhood aggregation or random walks. As a result, once the network becomes deep, over-smoothing appears: the learned node representations grow more and more similar with each additional layer until they can no longer be told apart, and performance drops sharply. In practice graph networks typically perform best at around 2 layers, so the key question for deep GNNs is how to capture deeper information while avoiding over-smoothing.
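
The effect is easy to reproduce numerically. The sketch below (a toy example on a random graph, not taken from any of the papers listed) repeatedly propagates random node features with the symmetrically normalized adjacency matrix and tracks the average pairwise cosine similarity of the node representations; as the number of propagation steps grows, the similarity climbs toward 1, i.e. the nodes become indistinguishable.

```python
# Toy demonstration of over-smoothing: repeated neighborhood propagation makes
# node representations nearly identical. Graph and features are random.
import numpy as np

rng = np.random.default_rng(0)
n = 20
A = (rng.random((n, n)) < 0.2).astype(float)
A = np.maximum(A, A.T)                       # undirected toy graph
np.fill_diagonal(A, 0)

# Symmetric normalization with self-loops, as in a standard GCN layer
A_tilde = A + np.eye(n)
d_inv_sqrt = 1.0 / np.sqrt(A_tilde.sum(axis=1))
A_hat = A_tilde * d_inv_sqrt[:, None] * d_inv_sqrt[None, :]

def mean_pairwise_cosine(H):
    """Average cosine similarity over all node pairs (1.0 = indistinguishable)."""
    Hn = H / np.linalg.norm(H, axis=1, keepdims=True)
    S = Hn @ Hn.T
    return (S.sum() - n) / (n * (n - 1))

H0 = rng.standard_normal((n, 16))            # random initial node features
for depth in (1, 2, 4, 8, 16, 32, 64):
    Hk = np.linalg.matrix_power(A_hat, depth) @ H0
    print(f"{depth:3d} propagation steps -> mean cosine similarity "
          f"{mean_pairwise_cosine(Hk):.3f}")
# The similarity climbs toward 1.0 as depth grows (assuming the toy graph is
# connected): deeper propagation smooths all node representations together.
```

The papers below attack exactly this failure mode, for example with initial/identity residual connections (GCNII), propagation decoupled from transformation (APPNP, DAGNN), random edge dropping (DropEdge), or normalization layers (PairNorm).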


▌ Must-read series (4 papers)

【GCNII】Simple and Deep Graph Convolutional Networks [ICML 2020]

【GRAND】Graph Random Neural Networks for Semi-Supervised Learning on Graphs [NeurIPS 2020]

【DAGNN】Towards Deeper Graph Neural Networks [KDD 2020]

【APPNP】Predict then Propagate: Graph Neural Networks meet Personalized PageRank [ICLR 2019]


▌ Guohao Li series (3 papers)

Guohao Li has been consistently pursuing research on deep GNNs; his papers are must-reads.

【Guohao Li】Homepage: https://ghli.org/

【Guohao Li】DeepGCNs: Can GCNs Go as Deep as CNNs? [ICCV 2019]

【Guohao Li】DeeperGCN: All You Need to Train Deeper GCNs [arXiv 2020]

【Guohao Li】Training Graph Neural Networks with 1000 Layers [ICML 2021]


▌ Latest recommendations from 2021

Adaptive Universal Generalized PageRank Graph Neural Network [ICLR 2021]

Graph Neural Networks Inspired by Classical Iterative Algorithms [ICML 2021]

AdaGCN: Adaboosting Graph Convolutional Networks into Deep Models [ICLR 2021]



▌ Recommendations from 2020 (10 papers)

【DropEdge】Towards Deep Graph Convolutional Networks on Node Classification [ICLR 2020]

【PairNorm】Tackling Oversmoothing in GNNs [ICLR 2020]

Towards Deeper Graph Neural Networks with Differentiable Group Normalization [NeurIPS 2020]

Scattering GCN: Overcoming Oversmoothness in Graph Convolutional Networks [NeurIPS 2020]

Bayesian Graph Neural Networks with Adaptive Connection Sampling [ICML 2020]

Continuous Graph Neural Networks [ICML 2020]

Graph Neural Networks Exponentially Lose Expressive Power for Node Classification [ICLR 2020]

Measuring and Improving the Use of Graph Information in Graph Neural Networks [ICLR 2020]

Measuring and Relieving the Over-smoothing Problem for Graph Neural Networks from the Topological View [AAAI 2020]

【JK-Net】Representation Learning on Graphs with Jumping Knowledge Networks [ICML 2018]


▌ Other papers (4)

Deep Graph Neural Networks with Shallow Subgraph Samplers [arXiv 2020]

Tackling Over-Smoothing for General Graph Convolutional Networks [arXiv 2020]

Effective Training Strategies for Deep Graph Neural Networks [arXiv 2020]

Revisiting Over-smoothing in Deep GCNs [arXiv 2020]


Follow 【學姐帶你玩AI】 for all the paper recommendations you need.

