# Preface
Dropout is commonly used to suppress overfitting, and PyTorch provides a convenient module for it. Its parameter `p`, however, is easy to misread. In TensorFlow the corresponding parameter is called `keep_prob`, so it is natural to assume that PyTorch's `p` is likewise the fraction of units to keep. A quick experiment shows the opposite: `p` is the probability that a unit is zeroed, i.e. the fraction of units to drop. Note also that during training `torch.nn.Dropout` rescales the surviving units by 1/(1 − p) (inverted dropout), which is why the non-zero values below are doubled when p = 0.5.
```python
>>> import torch
>>> a = torch.randn(10, 1)
>>> a
tensor([[ 1.0824],
        [-0.6219],
        [-0.3044],
        [-0.3553],
        [-0.8303],
        [-2.1157],
        [-1.1850],
        [ 0.3868],
        [ 0.1184],
        [-0.8278]])
```

With `p = 0.5`, about half of the entries are zeroed and the survivors are scaled by 1/(1 − 0.5) = 2:

```python
>>> torch.nn.Dropout(0.5)(a)
tensor([[ 2.1647],
        [-0.0000],
        [-0.0000],
        [-0.0000],
        [-1.6607],
        [-4.2314],
        [-2.3699],
        [ 0.0000],
        [ 0.2368],
        [-1.6556]])
```

With `p = 0`, nothing is dropped and the input passes through unchanged:

```python
>>> torch.nn.Dropout(0)(a)
tensor([[ 1.0824],
        [-0.6219],
        [-0.3044],
        [-0.3553],
        [-0.8303],
        [-2.1157],
        [-1.1850],
        [ 0.3868],
        [ 0.1184],
        [-0.8278]])
```

With `p = 1`, every entry is dropped:

```python
>>> torch.nn.Dropout(1)(a)
tensor([[0.],
        [-0.],
        [-0.],
        [-0.],
        [-0.],
        [-0.],
        [-0.],
        [0.],
        [0.],
        [-0.]])
```
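On a 10-element tensor the counts are noisy, so here is a sketch that checks the same claim statistically on a large tensor: the fraction of zeroed elements is approximately `p` (not `1 - p`), the survivors are scaled by `1 / (1 - p)`, and in eval mode `Dropout` becomes the identity. The value `p = 0.3` is just an arbitrary choice for the demonstration.

```python
import torch

torch.manual_seed(0)

p = 0.3  # drop probability, chosen arbitrarily for this check
drop = torch.nn.Dropout(p)
x = torch.ones(1_000_000)

# A freshly constructed module is in training mode, so dropout is active.
y = drop(x)

# The zeroed fraction is approximately p, not 1 - p.
zero_frac = (y == 0).float().mean().item()
print(f"zeroed fraction: {zero_frac:.3f}")  # close to 0.3

# Surviving elements are scaled by 1 / (1 - p), preserving the expected value.
survivors = y[y != 0]
print(survivors[0].item())  # 1 / (1 - 0.3) ≈ 1.4286

# In eval mode Dropout is a no-op.
drop.eval()
print(torch.equal(drop(x), x))  # True
```

This also explains the doubled values in the `p = 0.5` transcript above: 1.0824 × 2 ≈ 2.1647.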