cogdl.models.nn.gat¶

Module Contents¶

Classes¶

- SpecialSpmmFunction | Special function for sparse-region-only backpropagation.
- SpGraphAttentionLayer | Sparse version of the GAT layer, similar to https://arxiv.org/abs/1710.10903.
- PetarVSpGAT | The GAT model from the “Graph Attention Networks” paper.
class cogdl.models.nn.gat.SpecialSpmmFunction[source]¶
Bases: torch.autograd.Function
Special function for sparse-region-only backpropagation.
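The idea behind a "sparse region only" spmm function is that, when a sparse matrix (stored as COO indices plus values) is multiplied by a dense matrix, the backward pass needs gradients only for the stored nonzero values. A minimal numpy sketch of that forward/backward pair (an illustration of the concept, not the cogdl implementation, which subclasses torch.autograd.Function):

```python
import numpy as np

def spmm_forward(indices, values, shape, b):
    """Sparse-dense matmul. indices: (2, nnz) row/col pairs;
    values: (nnz,); shape: (m, n) of the sparse matrix; b: dense (n, k)."""
    out = np.zeros((shape[0], b.shape[1]))
    rows, cols = indices
    for r, c, v in zip(rows, cols, values):
        out[r] += v * b[c]          # each nonzero scales one row of b
    return out

def spmm_backward_values(indices, b, grad_out):
    """Gradient w.r.t. the nonzero values only ("sparse region"):
    since d out[r] / d v_(r,c) = b[c], grad_v = <grad_out[r], b[c]>."""
    rows, cols = indices
    return np.array([grad_out[r] @ b[c] for r, c in zip(rows, cols)])

# nonzeros at (0,1), (1,0), (1,2) in a 2x3 sparse matrix
indices = np.array([[0, 1, 1], [1, 0, 2]])
values = np.array([2.0, 3.0, 1.0])
b = np.arange(6.0).reshape(3, 2)    # dense right-hand side
out = spmm_forward(indices, values, (2, 3), b)
grad_v = spmm_backward_values(indices, b, np.ones((2, 2)))
```

The key property is that `grad_v` has one entry per stored nonzero, so dense zero entries never receive (or allocate) gradient.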
class cogdl.models.nn.gat.SpGraphAttentionLayer(in_features, out_features, dropout, alpha, concat=True)[source]¶
Bases: torch.nn.Module
Sparse version of the GAT layer, similar to https://arxiv.org/abs/1710.10903.
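Per the referenced paper, a GAT layer scores each edge (i, j) as e_ij = LeakyReLU(aᵀ[Wh_i ‖ Wh_j]) and normalizes the scores with a softmax over each node's edges. A numpy sketch of that per-edge attention computation (illustrative only; names and shapes are assumptions, not the layer's actual code):

```python
import numpy as np

def leaky_relu(x, alpha=0.2):
    return np.where(x > 0, x, alpha * x)

def gat_edge_attention(h, W, a, edges, alpha=0.2):
    """h: (n, in_features) node features; W: (in_features, out_features);
    a: (2*out_features,) attention vector; edges: list of (i, j) pairs.
    Returns attention weights normalized over each source node's edges."""
    Wh = h @ W
    # e_ij = LeakyReLU(a^T [Wh_i || Wh_j]) for every edge
    e = np.array([leaky_relu(a @ np.concatenate([Wh[i], Wh[j]]), alpha)
                  for i, j in edges])
    # softmax over the edges sharing the same source node
    out = np.zeros_like(e)
    srcs = np.array([i for i, _ in edges])
    for i in np.unique(srcs):
        mask = srcs == i
        ex = np.exp(e[mask] - e[mask].max())
        out[mask] = ex / ex.sum()
    return out

edges = [(0, 1), (0, 2), (1, 0)]
h = np.array([[1.0, 0.0], [0.0, 1.0], [1.0, 1.0]])
W = np.eye(2)
a = np.array([1.0, 0.0, 0.0, 1.0])
att = gat_edge_attention(h, W, a, edges)
```

The sparse variant computes exactly these per-edge coefficients but aggregates them with a sparse matmul instead of materializing the dense n×n attention matrix.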
class cogdl.models.nn.gat.PetarVSpGAT(nfeat, nhid, nclass, dropout, alpha, nheads)[source]¶
Bases: cogdl.models.BaseModel
The GAT model from the “Graph Attention Networks” paper.
- Args:
  num_features (int): Number of input features.
  num_classes (int): Number of classes.
  hidden_size (int): Dimension of the node representations.
  dropout (float): Dropout rate for model training.
  alpha (float): Negative slope of the LeakyReLU.
  nheads (int): Number of attention heads.
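The signature suggests the usual multi-head GAT layout: the hidden layer runs nheads attention heads in parallel and concatenates them, so the output layer sees nheads × hidden_size features. A numpy shape sketch of that combination (assumed shapes for illustration; not the cogdl forward pass):

```python
import numpy as np

rng = np.random.default_rng(0)
n_nodes, nfeat, nhid, nclass, nheads = 5, 8, 4, 3, 2

# each hidden head maps nfeat -> nhid; concat=True joins the heads
hidden_heads = [rng.standard_normal((n_nodes, nhid)) for _ in range(nheads)]
hidden = np.concatenate(hidden_heads, axis=1)   # (n_nodes, nheads * nhid)

# a single output head maps nheads*nhid -> nclass
# (with several output heads and concat=False, they would be averaged)
W_out = rng.standard_normal((nheads * nhid, nclass))
logits = hidden @ W_out                         # (n_nodes, nclass)
```

This is why nheads multiplies the input width of the output layer but not the final number of classes.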