This article walks through the code of the `_calculate_fan_in_and_fan_out` function to explain how Xavier and Kaiming initialization compute a layer's fan_in (number of input units) and fan_out (number of output units), covering the Linear and Conv2d cases.

PyTorch series 9 (extra): how Xavier and Kaiming obtain fan_in and fan_out, `_calculate_fan_in_and_fan_out` explained

m_c = nn.Conv2d(16, 33, 3, stride=2)
m_l = nn.Linear(1, 10)
m_c.weight.size()
m_l.weight.size()

out:

torch.Size([33, 16, 3, 3])
torch.Size([10, 1])

Note that the Linear weight has 2 dimensions, while the Conv2d weight has 4.
The function first checks the tensor's dimensionality. If it is 2-D, the weight belongs to a Linear layer:

if dimensions == 2:  # Linear
    fan_in = tensor.size(1)
    fan_out = tensor.size(0)

In this case: fan_in = in_features, fan_out = out_features (for a Linear layer the weight has shape [out_features, in_features]).

If the tensor has more than 2 dimensions, it is a convolution weight: the first dimension of Conv2d.weight is out_channels, the second is in_channels, and the third and fourth are the kernel_size. The else branch reads those two leading dimensions, then uses tensor[0][0].numel() to get the number of elements in tensor[0][0], i.e. weight.size(2) × weight.size(3) (× … for higher-dimensional convolutions). For m_c this is 3 × 3 = 9. Multiplying this receptive-field size by num_input_fmaps and num_output_fmaps gives fan_in and fan_out:

fan_in = in_channels × kernel_size[0] × kernel_size[1]
fan_out = out_channels × kernel_size[0] × kernel_size[1]

else:
    num_input_fmaps = tensor.size(1)
    num_output_fmaps = tensor.size(0)
    receptive_field_size = 1
    if tensor.dim() > 2:
        receptive_field_size = tensor[0][0].numel()
    fan_in = num_input_fmaps * receptive_field_size
    fan_out = num_output_fmaps * receptive_field_size

Here is the test code:

m_c = nn.Conv2d(16, 33, 3, stride=2)

m_l = nn.Linear(1, 10)

m_c.weight.size()
Out[30]: torch.Size([33, 16, 3, 3])

m_l.weight.size()
Out[31]: torch.Size([10, 1])

m_c.weight[0][0]
Out[32]: 
tensor([[-0.0667,  0.0241,  0.0701],
        [-0.0209,  0.0364,  0.0826],
        [ 0.0803, -0.0535,  0.0316]], grad_fn=<SelectBackward>)

m_c.weight[0][0].numel()
Out[33]: 9
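For context on where these fans end up: Xavier uniform initialization samples from U(-a, a) with a = gain · sqrt(6 / (fan_in + fan_out)), and Kaiming normal uses std = gain / sqrt(fan), where fan is fan_in or fan_out depending on the mode. A small sketch using the fans computed above (the helper names here are my own, not PyTorch's):

```python
import math

def xavier_uniform_bound(fan_in, fan_out, gain=1.0):
    # bound of U(-a, a) in Xavier/Glorot uniform initialization
    return gain * math.sqrt(6.0 / (fan_in + fan_out))

def kaiming_normal_std(fan, a=0.0):
    # gain for leaky_relu with negative slope a; a=0 gives sqrt(2) (ReLU)
    gain = math.sqrt(2.0 / (1.0 + a ** 2))
    return gain / math.sqrt(fan)

# for the Conv2d above: fan_in = 16*3*3 = 144, fan_out = 33*3*3 = 297
print(xavier_uniform_bound(144, 297))  # sqrt(6/441) ~ 0.1166
print(kaiming_normal_std(144))         # sqrt(2)/12  ~ 0.1179
```

This is why getting fan_in and fan_out right matters: both initialization schemes scale the weight distribution directly by these counts.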
